It's time to make computing personal again
187 comments · January 19, 2025
Animats
> None of the supplied examples showed any form of network effect. It was all stuff you did at home.
That's what's wrong with the various "federated" social networks. They lack a network effect that makes them grow.
bruce511
Well yes and no. There's no existing network (for something new), so certainly there's no network effect making them grow.
On the other hand, the pitch to get people to join is weak. I don't pitch it to my friends because (currently) it's a pretty poor experience compared to what they are already using.
bestham
Because of the way iOS apps are sandboxed together with their user-created content, a lot of users have video projects locked into CapCut without an easy way to access them following the ban of the TikTok suite of apps. Remind me how your iPhone is yours, when your creations on your device can be locked away from you.
Arcanum-XIII
Well, I have access in Files to a lot of content from my apps; it's a decision by the app creator not to use this and to keep created content in the locked area of the app.
For example, the apps from Omni expose their documents, as do Obsidian, Linea…
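For what it's worth, whether an app's documents show up like that comes down to two real Info.plist keys; a minimal fragment a developer would ship (surrounding plist boilerplate omitted):

    <key>UIFileSharingEnabled</key>
    <true/>
    <key>LSSupportsOpeningDocumentsInPlace</key>
    <true/>

With both set, the app's Documents folder appears in the Files app and in Finder/iTunes file sharing; omit them and user content stays inside the sandbox.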
Let’s assign the blame where it should be here.
badsectoracula
> Let’s assign the blame where it should be here.
Obviously the blame lies on Apple for locking away your device's contents from you. Developers should not be able to have more control over what you can access on your device than you do. Even if they make bad choices (like making accessing the files hard) it should be you who has the final say, not them.
Apple making it possible for developers to make bad choices and go against users' control over their own devices is to blame.
bruce511
An iPhone is a very non-typical device. Apple is a non-typical company which builds lock-in to every step of the process.
If you chose to use iAnything then it's a bit late to start complaining about lock-in now.
lazide
When ‘not typical’ is actually the norm for a huge swath of users, perhaps non-typical is not the right term?
spencerflem
I agree, though I think the problem is more that the focus of attention is not on making personal computing better, so it's withered. And some programs you could once get as a buy-once, works-offline experience are now subscription-based, as-a-service.
gazchop
Extrapolating this point outward, I don't think there is really any community computing.
Most people I know literally still use the lowest common denominator of communications, because corporations have managed to screw up interoperability in their land grabs to build walled gardens. The lowest common denominator in my area is emailing Word documents or PDFs around. Same as we have been doing for the last 30 years. The network effect there was Word being the first thing on the market.
All other attempts have been entirely transient and are focused on either social matters or some attempt at file storage with collaboration bolted on top. The latter, OneDrive being a particularly funny example, generally results in people having millions of little pockets of exactly what they were doing before, with no collaboration or community at all.
If we do anything now it's just personal computing with extra and annoying steps.
And no, 99% of the planet doesn't use GitHub. They just email shitty documents around all day. Then they go back home and stare at their transient, worthless social community garbage faucet endlessly until their eyes fall shut.
RachelF
Recently there's been another shift - processing power.
In the past you could do almost anything on a personal computer; it was generally about as fast as a mainframe or high-end workstation.
Training large AI models is currently impossible for most home users. They just do not have the processing power.
bruce511
I feel like the "past" is a shorter timeline for you than it is for me.
For all the examples mentioned in the parent article, PCs were significantly under-powered compared to workstations, much less mainframes.
An explosion of hardware development between 2005 and 2020 led to an era where hardware outperformed software needs. Now software is catching up.
But there have always been use cases for high end hardware, and always will be.
ksec
Yes. The vast majority of computing is still under-powered; Chromebooks, for example. The fanless Apple Silicon MacBook Air only arrived in 2021. And I would argue that if we want AR or latency-sensitive applications, our computing power is still off by at least an order of magnitude.
shwouchk
That's not true in many domains, where doing it on a personal computer would either take too long outright, or take too long unless you're skillful at using faster memory as a cache.
Video production, climate simulations, PDEs, protein folding, etc.
dsign
I agree with you; all of those needed vastly more computing than was available in a PC. If anything, the power of modern hardware has made a lot of it more available in personal workstations. Though it is true that hyped-for-the-masses personal computing devices are not optimized in that direction. You get what you buy.
interludead
The challenge is making community computing ethical
BrenBarn
Yeah, but I think part of the point is people don't actually want or need network effects for a lot of things. Even where connection is needed, companies have used it to wedge in stuff that doesn't benefit users.
jwr
This article made me even more sad than I already was. I've just been reading about Bambu Lab (a leading 3D printer manufacturer, who introduced really good 3D printers a couple of years ago and really shook up the entire market) self-destructing and burning through all the goodwill accumulated over the years. They are working on closing down access to their printers, apparently with the end goal of locked-down, subscription-based access. This is much like the path HP followed with its printers.
I also write this on a Mac, where I'm watching with sadness the formerly great company being run by bean-counters, who worry about profits, not user experience. The Mac is being progressively locked down and many things break in the process. Yes, it is still better than Windows, where apparently the start menu is just advertising space and the desktop isn't mine, but Microsoft's, but the path is definitely sloping down.
It's just sad.
I don't know what else to write. There isn't much we can do, as long as everybody tolerates this.
markus_zhang
I have the same fear as you do.
My prediction is that in the not-too-distant future, perhaps 20-25 years, with the "blessing" of national security, the ads business, and other big players, devices will be further locked down and tracked, INCLUDING personal computers.
A lot of people already don't own a computer nowadays, except for the pocket one. In that future, PCs, if they still exist, are perhaps either thin clients connecting to the vast National Net, where you can purchase subscriptions for entertainment, or completely locked-down pads that ordinary people don't even have the tools to open properly. Oh, all "enhanced" with AI agents, of course. You might even get a free one every 5 years -- part of your basic income package.
They won't make learning low-level programming or hardware hacking illegal, because those are still valuable skills, and some people need to do them anyway. But for ordinary people it's going to be a LOT tougher. The official development languages of your computer system will be some sort of Java and JavaScript variants that are SAFE. You simply won't get exposed to the lower level. Very little system-level API will be exposed, because you won't need to know. If you have issues, submit a ticket to the companies who program the systems.
We are already halfway there. Politicians and the super-rich are going to love it.
wegfawefgawefg
Anybody who really wants to learn to code will just install Linux on an old clunker. Every uni and high school student I know who means business does this.
markus_zhang
Yes I do that too but using VMs. I hope I'm just overthinking.
z3phyr
Single-player video games are not going to die. And the market seems to punish any push for an always-online model (which is obviously a scam). I say this because a bulk of the market for personal computing is driven by video games.
nntwozz
Hard disagree.
macOS has been very conservative in redesigning the user experience; it's aging slowly like fine wine. There are a few hiccups occasionally but I feel it's a lot more polished and elegant compared to the quirkiness of earlier versions. I don't get this common sentiment that it was better in Snow Leopard etc.
Stability is great, power consumption is awesome since the introduction of the M-series chips and I can still do everything (and more) that I did on my mac 20 years ago. Yes there are some more hoops here and there but overall you have to keep in mind that macOS never fell into the quagmire Windows did with bloatware, malware and all the viruses (although I think the situation is much better today).
macOS has been walking a fine balance between lockdown and freedom, there is no reason to be alarmist about it and there are still projects like OpenCore Legacy Patcher that prove you can still hack macOS plenty to make it work on older macs.
We're eating good as Mac users in 2025; I don't buy the slippery-slope fallacy.
jwr
There definitely is neglect and a slippery slope.
The new Settings app is half-baked and terrible. The OS pesters me constantly to update to the latest revision, and I can't turn those messages off, or even close the notification without it opening the Settings app. And I don't want the latest revision, because it is buggy, breaks a number of apps, and introduces the "AI" features that I do not want or need.
More and more often good apps get broken by the new OS policies (SuperDuper is a recent example).
The old style of system apps that did wonderful things is gone (think QuickTime Player, or Preview); these apps are mostly neglected. The new style is half-baked multi-platform apps like Settings, which do little, and what they do, they do poorly.
leidenfrost
Unlike your parent comment, I do think that the Mac has favored lockdown all the way.
But it does a wonderful job at doing so.
Macs feel less like a personal computer and more like an appliance. Which works great if you do things that don't require tinkering, like office tasks or even webdev.
And I do love Linux, especially the more hobbyist distros like Gentoo or NixOS.
But at some point in my life I decided to spend more of my time (outside work) on other parts of my life. As a result, having to spend a weekend solving some weird use case, be it in the package manager or the WM, is a pain.
noobermin
Linux exists.
I know the usual comments will crop up, but now, if ever, is the best chance to give it a try, at least as a semi-daily driver if you still want to play games and such.
BrenBarn
I switched to Linux a couple years ago and overall am glad I did, but it's only a partial solution.
As I see it, one way to phrase the problem is that Linux (along with its ecosystem) isn't really user-focused either. It's developer-focused. Tons of weird decisions get made that are grounded in developer desires quite removed from any user concerns. There are a lot of developers out there that do care about users, so often you get something decent, but it's still a bit off-center.
vanviegen
Can you name some example(s) please?
Pannoniae
Linux definitely exists... except that it isn't free from this philosophy either. From the "don't theme my apps" movement, to Wayland's "security above usability" philosophy... I recently even read about some kallsyms functions being unexported in a 5.x release, because it shouldn't be that easy to look up internal kernel symbols, or something.
Not to mention many projects refusing to add configurability and accessibility, citing vague maintainability concerns or ideological opposition.
Another blatant example is the 6.7 kernel merging anti-user "features" into AMDGPU... previously you could lower your power limits as much as you wanted; now you have to use a patched kernel to lower your PL below -10%...
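For reference, that limit lives in the standard hwmon sysfs files amdgpu exposes; here's a rough Python sketch of poking at it, where card0 and the 150 W target are just example values for one machine:

    from pathlib import Path

    # amdgpu publishes its power limits through the generic hwmon API, in microwatts.
    hwmon = next(Path("/sys/class/drm/card0/device/hwmon").glob("hwmon*"))

    cap_min = int((hwmon / "power1_cap_min").read_text())  # driver-enforced floor
    cap_max = int((hwmon / "power1_cap_max").read_text())
    cap = int((hwmon / "power1_cap").read_text())
    print(f"cap {cap // 10**6} W, allowed {cap_min // 10**6}-{cap_max // 10**6} W")

    # Needs root. On 6.7+ the driver rejects writes below power1_cap_min,
    # which is exactly the restriction complained about above.
    (hwmon / "power1_cap").write_text(str(max(cap_min, 150 * 10**6)))

Before 6.7, the same write with a much lower value was simply accepted.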
Everywhere you go, you can find these user- and tinkerer-hostile decisions. Linux isn't much better than Windows for the semi-casual tinkerer either - at least on Windows you don't get told to just fork the project and implement it yourself.
I'm a bit hesitant to call this corporate greed as it's literally happening in the OSS ecosystem too. Sadly I don't have answers why, only more questions. No idea what happened.
jwr
I should probably have a pre-defined disclaimer "signature" whenever I write about Mac OS, since I always get this response.
I know Linux exists. In fact, I've been using it as my primary OS roughly from 1994 to 2006, and since then intermittently for some tasks, or as a main development machine for a couple of years. I wrote device drivers for Linux and helped with testing the early suspend/hibernate implementations. I'm all in support of Linux.
But when I need to get work done, I do it on MacOS, because it mostly works and I don't have to spend time on dealing with font size issues, window placement annoyances, GPU driver bugs, and the like. And I get drag&drop that works anywhere. All this makes me more efficient.
But I don't want to turn this discussion into a Linux vs MacOS advocacy thread: that's not what it's about. In fact, if we were to turn back to the main topic (ensh*ttification of everything around us), Linux would be there, too: my Ubuntu already displays ads in apt, as well as pesters me with FUD about security updates that I could get if I only subscribed. This problem is not magically cured by switching to any Linux, it's endemic in the world we live in today.
safety1st
No, it really is cured by switching to Linux, or more precisely to free/libre software. Ubuntu introduced ads, so I switched to Mint. I could do that because the code is all GPL and the ecosystem is large enough that there were sufficient other people with beefs about Ubuntu to do something. The license and the ability of the community to fork are the keys.
Consumer software has gone straight downhill for the last 20 years and while the FOSS alternatives have some rough edges I always at least try them first. The outcome has been that I am shielded from most of the industry's worst excesses. Bad things happen, the world gets worse, and I just read about it, it doesn't affect me. I am more of a radical than the post author, I say in your personal life, roll it all back 100%, return to history, modernity is garbage, computing has taken a wrong turn because we have allowed it to be owned by illegal monopolies and cartels. I do make compromises in the software stack we use for business simply because my employees are not as radical as I am and I need to be able to work with normal humans.
xmprt
Your explanation why Linux isn't the solution is actually a massive pro in favor of Linux. There's nothing special about Ubuntu that's holding you hostage and if you wanted to switch distros, you could do it in an afternoon. Unlike switching from Mac or Windows which would take much longer and would probably never be a 100% migration.
ehnto
It would be nice if we could trust corporations to stay some kind of course and have our best interests at heart, but they don't, and at some point it starts being our own fault if we keep enduring it. It then follows, though, that once we have full control over our tools, it's our own fault if we choose not to go solve the issues ourselves, but that doesn't feel entirely fair.
We can't personally be responsible for everything. So to bring it back home to enshittification: a free market, free from monopolies or duopolies, should be the solution. As one product gets shit, a hole opens up in the market and could be filled. That's not happening, though, so what's going wrong? If it could happen anywhere, it's Silicon Valley: so much money, a culture of disruption and innovation, all the right skills floating in the employment pool. But software keeps getting more and more shit.
vunderba
Two things:
1. If you were a bit more familiar with Apple history, you'd know that the Mac was actually Steve Jobs's push to make things more proprietary and locked down, not less. Make of that what you will.
2. If your ideological stance is in opposition to companies like Microsoft/Apple/etc. and you work in the tech industry, the most effective action you can take as an individual is to deny them your labor.
wqaatwt
> make things more proprietary and locked down, not less
Yet he accidentally made OS X considerably more open than its predecessor by [I presume] pure accident?
__MatrixMan__
> There isn't much we can do, as long as everybody tolerates this.
I don't know if this will be effective in any way, but I've decided to start hosting services for my friends and family out of my closet. It seems that money destroys everything it touches, so it feels nice to be doing something for reasons besides money.
My friends and family are not particularly influential people, but I think it'll be nice to create a little pocket of the world who knows what it was like to not be exploited by their tech.
ehnto
There's heaps you can do, though admittedly not all of it will be what you want, and not all of the results will be equivalent.
For one, though, you can support open source software, especially Linux OSes. Similarly, ditch the Bambu. There are countless better and more open printers out there, and you can DIY excellent 3D printers that get great results.
I think that's the point of difference between now and the past, information has spread so far, and people have fought so hard for open source software and hardware, that we actually have a good defence against corporate greed. You accept some compromise and work a little harder for it, but it's really not that bad.
mvdtnz
You cannot simultaneously complain about companies closing systems off and give Apple any credit at all for the past 20 years of operation. They are the absolute worst offender in the industry without exception.
And no, if the axis you are measuring on is openness versus locked down then Microsoft is not worse. You have simply been brainwashed.
canadaduane
See https://media.ccc.de/c/DS2024 for inspiration and ideas.
titzer
This article really resonated with me. Unfortunately I think things aren't going back. What the article doesn't appreciate--and we techies don't either--is just how much the scale of today's tech market absolutely dwarfs the scale of the tech market back in the days before the internet.
The market wanted growth. Early tech companies, like Microsoft, Apple, eBay, and then Google, went from zero to huge in a very short period of time. But companies like the FAANGs kept up the absurd levels of growth (20+% YoY growth in the case of Google) that Wall Street got hooked on, and it's been on a drug binge ever since. The result is that we have multiple trillion dollar companies that will...never not want to be a trillion dollar company.
The total amount of money in the PC market was minuscule compared to today, and the internet and its online-retail-plus-ads bonanza dwarfed even that. The PC software market, the video games industry, everything--it was all so much smaller. As the internet swallowed the world, it brought billions of users. And those billions of users can only use so many devices and so many games and spreadsheets and stuff. They had to be made into cash cows in other ways.
The tech market just has to keep growing. It's stuck tripping forward and must generate revenue somehow to keep the monsters' stomachs fed (and their investors too). We will never be free of their psychotic obsession with monetization.
And advertising is soooo insidious. Everything looks like it's free. But it isn't, because our eyeballs and our mindshare are for sale. And when they buy our eyeballs, they're making those dollars back off us--that's the whole point. So whether you like it or not, you're being programmed to spend money in other parts of your life that you wouldn't otherwise. It cannot move in any direction but falling forward into more consumerism.
I'm afraid I'm a doomer in this regard. We're never going back to not being bothered to death by these assholes who want to make money off us 24/7.
Nevermark
It is the legal system that hasn't caught up with how tech scales seemingly small damage.
What were small conflicts of interest before (a little trash here or there, a little use of personal information for corporate instead of customer benefit here or there, ...) now scales to billions of people. And dozens of transactions, impressions, actions, points of contact, etc., a day for many of us.
That not only makes it more pervasive, but massively profitable, which has kicked in a feedback loop for sketchy behavior, surveillance, coercion, gatekeeping, etc., driven by hundreds of billions of dollars of revenue and trillions in potential market caps.
Things that were only slightly unethical before, now create vast and growing damage to our physical and mental environments.
It should simply be illegal to use customer information in a way not inherent to the transaction in question. Or to gather data on customers from other sources. Or share any of that data.
It should be illegal to force third-party suppliers to pay a tax to hardware makers for any transaction that doesn't require their participation. And participation cannot be made mandatory.
Etc.
One commonality here is that there is often a third party involved. Third-party gatekeepers. Third-party advertisers. Third parties introduce conflicts. (This is different from non-personalized ads on a site they have relevance for, which are effectively two independent, 2-party transactions.)
Another commonality is the degree to which many third-party actors, those we know and many we never hear of, "collude" with respect to dossiers, reaching us, and milking us by many coordinated means.
Animats
> It is the legal system that hasn't caught up with how tech scales seemingly small damage.
Most administrations are squishy-soft on corporate crime. If there were regular antitrust prosecutions, violations of Federal Trade Commission regulations were crimes, wage theft was treated as theft, forging safety certifications was prosecuted as forgery, and federal law on warranties was strictly enforced, most of the problems would go away.
In the 1950s and 1960s, all that was normal. The Americans who lived through WWII were not putting up with that sort of thing.
lazide
The economy was also wildly different back then - there were massive, fundamental, competitive advantages the US was continuing to reap due to being on the winning side of WW2 (in every way).
For instance, nearly every country was paying the US loans back, in USD, or was having to depend on the US in some way.
Nearly every other country in the world had their industrial base (and often male population) crushed in the war.
Etc.
Those things cost money/effort, and require a consistent identity and discipline.
II2II
In some respects, I agree. Yet I don't think we have to put up with it all of the time. Most of the technology in our lives is either frivolous or has a workable alternative. It is not as though we have to abandon technology, or even current technology, in pursuit of the personal. Yes, it involves making more careful decisions. Yes, it will likely be limited to people with technical knowledge. On the other hand, that was true of computing in the 1980s and largely true of computing in the 1990s.
In many respects, we are also better off than we were in the 1980s. There are more of us, we are connected globally, and the tools that we have access to are significantly better. We also have a conceptual framework to work within. Technically speaking, Free Software may have existed back then, but few people even knew of it. People were struggling with ideas like public domain software (rarely with an understanding of what that meant). If you wanted to make money outside of traditional publishing channels, you were usually toying with ideas like shareware (where you had pretty much no control over distribution). If you wanted to spend money on software, outside of traditionally published stuff, chances are you had to send cheques or cash to somebody's house.
And then there is communicating with likeminded people. We may like to complain about things like Discord or Reddit, but they are not the only players on the block. Plenty of people still run small or private forums. Yeah, they can be hard to find. On the other hand, that has more to do with the noise created by the marketplace rather than their lack of presence.
everdrive
>There are more of us, we are connected globally,
Why is this good?
noobermin
The problem with the nimby/ecofascist/exclusionary perspectives is that the obvious retort is always "okay, yes, there are too many people in this domain; the solution then is for you to quit, not me." And substitute whichever group doesn't encompass you, which usually falls along racial, gender, or class lines. At the end of it, no one wants to fall on their sword for everyone else.
The thing is, the older I get, the more it seems like, at the very least, the pie is not growing in a number of areas (the example at the top of my mind is academia), and sometimes it seems like the easier solution is to decrease the numerator. But I don't know how you can do that and justify it morally, both to society and to yourself.
llm_trw
It's time we give up on the majority of people who don't care for freedom and focus on the few that do.
Unfortunately at the time we need them the most pretty much every pro-user organization is imploding because everyone and their grandmother wants to turn them into vehicles for whatever their pet cause is.
BrenBarn
Also, even if they're not, they're getting squeezed out. It's hard to stay afloat trying to just do a thing without your eye on the "prize" of getting bought out by Google et al.
protocolture
I mean, the solution is inside your definition of the problem. Infinite capital growth isn't possible. They will either finally make their products unusable or collapse. When they have collapsed enough and we have reached the plateau of innovation someone will make some basic device interoperable with everything and leave us be to count their millions instead of billions.
It's just another bubble, one predicated on mining the users rather than expanding the product.
coldtea
>What the article doesn't appreciate--and we techies don't either--is just how much the scale of today's tech market absolutely dwarfs the scale of the tech market back in the days before the internet.
I understand it and know it. But I don't appreciate it either (in the sense of liking it).
dangus
I think it's easy to forget that computing technology is a tool. Of course it was bound to be huge today, because it's supposed to be a tool in the toolbox of every company. It wasn't as big back then because not every industry could incorporate it right away, knew how to, or was interested in doing so.
It's not bad that it's big. It only needs to grow because the rest of the economy needs to grow.
I am also afraid you're a doomer in this regard. You don't think the bigwigs with their fax machines in the 1980s wanted to make money off of us 24/7? Of course they did.
Tech is scary in the sense that it's now gone quite a bit beyond the understanding of the average joe. Even most of us on this site probably don't fully understand how detailed a picture of a person data can paint. There are companies that probably know something about me that I don't even know.
I guess I don't know how to alleviate that feeling, and maybe it's the correct default assumption to be a doomer. It certainly would be very helpful if the US treated the situation more like the EU treats the situation.
spencerflem
This is part of why I've been so excited about Genode/Sculpt https://genode.org/documentation/articles/sculpt-24-10
It's tiny, clearly built with love for the user, doesn't do a heck of a lot, and has some interesting ideas that are just fun to mess around in. And unlike some of the similar retrocomputing OSes (which are also lovely, but grounded in old-fashioned design), Genode feels like a glimpse into the good future.
abrookewood
That looks like the most radical/unusual operating system thing I have seen in recent memory. Not sure how practical it is, but kudos for trying something so different.
spencerflem
It's so cool, I could talk about it forever. It's practical enough for the devs to use it as a daily driver (though with Linux in VirtualBox or Seoul for some things, like running their builds), and there are a few businesses built on it.
But nowhere near as practical as Linux at the moment of course
tombert
Interesting, I didn't know anyone had tried to put seL4 on a desktop.
I think it'd be very cool to have a fully verified kernel...
portercable
I had not heard of Genode/Sculpt, but it looks interesting. These days, I feel like if I boot a new operating system, I have no idea what all it's doing and whether or not things are secure--I'm basically relying on the operating system to have good defaults. And then it's so easy to screw something up!
I like the idea of Qubes and it looks like Genode might be an even better idea...
spencerflem
It's a very similar philosophy to Qubes; one of their open challenges is to port the Qubes infrastructure over, since Qubes is (in theory at least) hypervisor-independent. https://genode.org/about/challenges Which would be nice, since the NOVA hypervisor is dramatically less code than Xen, and Nitpicker/Dialog for the management console is dramatically less code than Fedora.
I've looked into it briefly but it seems like too much work for me right now.
The True Genode Way, of course, is that everything worth having would eventually be ported as a native Genode component instead of a Qubes-style VM. They've put a lot of effort into making that as easy as they can with Goa (a Nix-inspired package management and build tool), additions to their C standard library, and ports of popular 3rd-party libs like SDL.
spencerflem
Also - their defaults are pretty hilarious.
They don't assume you want anything beyond a RAM-only filesystem. By default it starts out completely immutable, with nothing able to save anything anywhere.
If you want to save anything to a hard drive, you have to enable that driver, because they don't assume you'd need one.
Copy and paste is an optional extra to install.
It's wild :p
latentcall
Wow, this looks really cool. How does it handle Atheros WiFi cards? I have a ThinkPad X200 I’d love to throw this on for fun. Thanks for sharing!!
spencerflem
Not sure! They have a system set up for porting drivers from Linux into userspace components, so it punches above its weight.
From their description: "It is tested best on laptops of the Lenovo X and T series (X220, X250, X260, T430, T460, T470, T490)". The X200 isn't on the list, but you'd probably have about as good a time as you can.
musicale
> How many Nintendo Entertainment System games sustained themselves with in-app purchases and microtransactions? What more did the console ask of you after you bought a cartridge? Maybe to buy another one later if it was fun?
True, but unlike the Apple II, the NES was not an open system. The NES had hardware DRM, which allowed Nintendo to control what games were published for the system and to charge licensing fees (much as Nintendo, or Apple, do today). Nintendo also tried (unsuccessfully) to shut down Game Genie.
steve_taylor
If you want to cheat in 2025, you buy a bag of virtual coins and spend those coins on boosters, extra turns, etc.
If you wanted to cheat in 1992, you'd call the Sega Hotline on a premium phone number and they'd give you cheat codes.
It's the same thing, just a different medium and middleman.
TapamN
In 1992, you had more options. Your friends could tell you for free, you could stumble on them yourself, or you could get them from a magazine or book (which you didn't necessarily have to buy; you could just flip through it at the store and memorize the cheats).
Gormo
Don't forget about dialing into the local BBS and trying to find cheats and tricks in text files.
wkat4242
In those days cheating didn't impact other players. This is why pay to win is a bigger problem now.
bityard
> you'd call the Sega Hotline on a premium phone number
I remember the ads for that, but I've never met a single person who did it. (Or whose parents would be okay with it.) Cheat codes were either shared by word of mouth among friends or in magazines. Or you bought a Game Genie, but that was more for messing around with a game's mechanics than actual, blatant cheating.
interludead
In 2025, the "cheating" has become a business model
sharpshadow
It reads a bit romantic, leaving out geopolitical interests and seeing money as the sole motivator.
xnx
There's plenty of Ed Zitron's opinions I don't agree with, but this is a really good quote:
"Our economy isn’t one that produces things to be used, but things that increase usage."
api
That's a side effect of the way we've educated the market to expect everything to be "free." That leaves indirect monetization, through ads or in-app purchases or something similar, as the only option available.
xnx
True. I hope the pendulum can swing back the other way if services push too hard.
inetknght
Once upon a time, it was illegal to discount something to gain market share and then charge extra once you'd bullied out your competition. Technically it's still illegal, but good luck finding enforcement.
We're seeing the "free" version of that.
api
This is called dumping and yes it was and maybe still is illegal with things like commodities and manufactured goods.
It was never enforced with software or services. If it had the entire standard VC startup playbook would be different.
It’s also never been enforced internationally. China has arguably been subsidizing its industries and effectively dumping cheap manufactured goods for years to become the workshop of the world, and it works.
maiar
And usage tends to go two ways.
x-complexity
> "Our economy isn’t one that produces things to be used, but things that increase usage."
...the quote, *AS A SOUNDBITE*, only sounds good on a surface level, but collapses under the slightest test. All products in some form or another increase the usage of resources in order to reach a certain goal.
https://www.wheresyoured.at/the-anti-economy/
The article the quote originates from contextualizes it by marking (a) the difference between products in service of an actual goal, (b) products that are only meant to look good on a balance sheet, and (c) how companies have morphed towards (b) in order to attract investor funds and increase share prices / company market values.
The quote, BY ITSELF AND WITHOUT CONTEXT, is a twisted Neo-luddist version of its original self.
musicale
I think it means increasing usage of the thing itself, and I think it's a good insight. While there is a natural supply and demand curve, unscrupulous growth-focused businesses optimize their products (unhealthy food) and services (gambling, social media, mobile games) for high levels of consumption (at least for a portion of vulnerable users), irrespective of harmful effects. It's the tobacco industry model reborn.
johnnyanmac
I think a more generous interpretation is simply one that critiques planned obsolescence and addictive algorithms. Some things need to increase usage by nature, but how many services have you used that really needed a subscription model to work?
XorNot
My hot take is planned obsolescence doesn't exist.
It's a side effect of items being built to a cost, and of the marketing phenomenon that consumers follow fashion trends.
Your car doesn't have planned obsolescence: it has a warranty period. If you want a longer one, you'll pay more because that is not a free service to provide.
8bitsrule
This is such a right write on the subject that it's already a classic manifesto.
tylerflick
I would argue that computing has never been more personal, if you're willing to put in a little effort. The advent of containerization, the miniaturization of PCs, and the overall drop in the cost of technology has allowed anyone to run their own personal intranet, homelab, whatever.
wvenable
If you want to run your own little silo completely disconnected from your fellow human beings, then it has never been easier. But that was never really all that difficult in the first place. I don't think it's truly the problem that needs to be solved.
jazzyjackson
Buy-in from the community is indeed the hard part, but I have friends running IRC and phpBB servers we hang out in, and Matrix is more or less viable to self-host for a group chat. It's just that 100x more people are using Discord and Signal because of network effects: your one account can give you access to a million communities.
I guess ActivityPub and Matrix are meant to be similar in that regard, but for whatever reason the learning curve is just a little steeper, so you have to be motivated by ideology to put up with the gaps in usability.
BrenBarn
Matrix has some promise, but it's also essentially VC-funded despite the way they try to present it.
spencerflem
I really want to love Matrix, but as it stands right now Discord is a lot more usable.
While I'm willing to put up with it, it's a hard sell to get your friends to use something worse.
johnnyanmac
Yeah, that network effect is always the hard part once you want or need to reach out past your personal circle. Even within a personal circle it can be hard to make people use a better but smaller service.
nicce
Sadly, only for people in very technical fields. Most common consumer products are impossible to use local-only.
interludead
Yep! And opting out is either prohibitively difficult or outright impossible for most people
spencerflem
I mean, the building blocks are there, but so much has moved into "the cloud". You can't run Just Photoshop anymore, you can only run CC, which sends all your images to them; you can't run Word without running Office 365; you can't run most games without Steam; etc. And all of the exciting new software is -As-A-Service.
So while there's more options now for homelab things the overall ecosystem has moved strongly away so there's a lot more to avoid
keyringlight
One of the aspects I've wondered about is the software bundled with either the OS or by the device manufacturer. When PCs (broadly, not just IBM compatibles) were penetrating into the home market they would often come with a suite of tools or demos to show what it could do, or let you create things even if they were the basic editions. Before the internet became part of the furniture, if you'd spent several hundreds on some hot technology there was a good chance you'd buy print magazines for it as well, and they would come with cover discs that exposed people to a lot of what was possible with computing.
Without wanting to sound like a stick in the mud, the focus of computing has definitely changed now. I see it as an interesting thought exercise: how do you get someone walking around with what is usually a marvel of computing in their pocket to imagine that it is not the apex of computing, whether to explore what other means of computing offer, or what comes next besides a slightly better version of what we have now.
BrenBarn
> I see it as an interesting thought exercise: how do you get someone walking around with what is usually a marvel of computing in their pocket to imagine that it is not the apex of computing, whether to explore what other means of computing offer, or what comes next besides a slightly better version of what we have now.
That is a great way of thinking about it and I'm curious what you've come up with. I think it's a pretty hard sell for most people, especially for things like messaging that have become very central to daily life. Also, there's a big difference between convincing someone to try something a bit less mainstream and convincing them to reject the mainstream version. Like, you may be able to get someone to install LibreOffice but it's a lot harder to get them to uninstall Excel.
Anecdotally, I've found that people who have some other kind of retro/niche/subculture interest can be somewhat more receptive to the idea that the newest thing isn't necessarily the greatest. Like someone who's into hunting for vintage clothes, or woodworking, or whatever. Ironically such people are on average more tech-averse than a typical "normie", but they often understand the concept that it can be useful to actually put effort into getting something that's not just whatever's handed to you. In a way the insidious aspect of recent tech is the way it's conditioned people to expect that they shouldn't have to think much about how to do things, and to just want "smart" technology that reduces decisions.
cynicalsecurity
Gimp, LibreOffice, Gog.
spencerflem
I love free software, but GIMP is not as good as 2006 Photoshop, and LibreOffice is not as good as 2007 Word.
GOG & itch & Humble are great, and as good as Steam when they have what you want, but the collection is a lot smaller.
DerekL
What's this about Photoshop? As far as I can tell, using cloud storage is an option in Photoshop, it's not required.
MiddleEndian
Aside from the cloud bullshit, Photoshop's auto-update is a pain. After one regression, I disabled updates right after they fixed it. But recently, when I open Photoshop, it's started giving me a nagging popup about being out of date. It's done more-or-less the same thing for many years; just take my subscription money and leave me alone!
spencerflem
The only way you can buy it is with cloud included as a subscription
nfw2
People would rather complain than put in a little effort, especially if it means losing the chance to invent an evil empire to rebel against.
api
Your definition of "anyone" is pretty skewed toward the tech-savvy.
Spend some time in the tech support desk of a mobile phone store to get an idea of the general level of technical sophistication of the average person. Average folks are not running containers. They're not installing... anything... except maybe an app from an App Store. Half of them aren't sure what a file is.
dragonmost
But hardware availability and affordability give them the opportunity to learn and experiment if they want to. Even tech-savvy people couldn't do that a decade or two ago. Not on a budget.
BrenBarn
It's a nice article, but like so many I feel like it has a reluctance to address some of the issues head-on. Like this:
> I’m not calling the tech industry evil.
Well. . . why not? I think at this point the tech industry is evil. Not in the sense that water is wet, and maybe not even in the sense that mammals birth live young, but sort of in the sense that ice occurs near the Earth's poles. There are some bits and pieces here and there that don't follow the pattern but they are the exception and they're getting smaller.
That doesn't mean that technology is evil, but the ways it's being used and developed often are.
And that gets to another aspect of this that I feel like people in the tech world sometimes overlook when talking about this: enshittification is not a technological phenomenon. It's a social phenomenon that is driven by our socio-legal apparatus which allows and encourages high-risk pursuit of sky-high profits. Corporate raiding a la the Sears debacle, consolidation in grocery stores, even something like tobacco/health or oil/climate-change coverups, all these are forms of enshittification. Tech is the most prominent venue, maybe because it's especially suited to certain forms of vendor lock, but it's by no means the only one.
Enshittification happens because we are not willing to take a sledgehammer to the idea that making businesses bigger and bigger is a good thing. Timid reforms focused on things like data privacy are woefully inadequate. Large companies need to be directly dismantled, wealth needs to be directly deconcentrated, and broad swaths of behavior that are currently happening left and right need to become the kind of thing that will land you in prison for 10 years.
I'm not optimistic this is going to happen without some kind of "hitting bottom", though, whatever form that may take.
amilios
Maybe I'm too cynical, but too many people in power directly benefit from enshittification for anything about it to change. Even the problem of fixing the housing market while the majority of politicians own several properties is an example of this. There's zero incentive for anything to change.
brandon272
I love the "which part of.." examples of companies and services that the author lists, along with the screenshots. I know that nostalgic feelings tend to not be an accurate representation of the past, but I do know that I used to look at a lot of those companies and products with some admiration. No, things were not perfect back then, but a lot of these products had a level of innocence, goodwill or benevolence that does not exist today. They seemed more rooted in innovation than value extraction at all costs.
Today, I look at those same companies with absolute derision over their completely unethical and hostile approaches to the world, the economy and dealing with the people that use or rely on them.
Worse, my ability to get excited about new companies, products, services and innovations has been completely blunted by the expectation that anyone working on something I think is "cool" will inevitably be co-opted by people who have the worst instincts: those who actually have no respect for technology or computing and view people as less than human, simply entities from which maximum value must be extracted at any cost.
interludead
Maybe we should reconcile the best of the past with the benefits of today...
BrenBarn
How?
talles
I 100% want the shift towards this, as probably does everyone in this comment section.
But how do we sell the layman on missing something he never experienced in the first place? Sadly, I believe we are doomed to be niche.
bruce511
I expect these comments to be full of agreement. Corporate behavior in the computer space leaves much to be desired.
I will, however, observe:
None of the supplied examples showed any form of network effect. It was all stuff you did at home.
Today, there are certainly options for personal computing for most everything, as long as network effects are not in play.
Those options may not be as convenient, as cheap, or as feature-rich as the invasive option. That's fair though - you decide what you want to prioritize.
Network effects are harder to deal with, to the extent that in order to be in a community you need to adopt the software the community has chosen.
Not surprisingly, software producers that can build-in network effects, do so. It's excellent from a lock-in point of view.
The title of the article is perhaps then ironic. It's trivial to make computing personal. All the tools to do so already exist.
The issue is not Personal Computing. It's Community Computing.