Why Local-First Apps Haven't Become Popular?

lordnacho

Local-first was the first kind of app. Way up into the 2000s, you'd use your local excel/word/etc, and the sync mechanism was calling your file annual_accounts_final_v3_amend_v5_final(3).xls

But also, nowadays you want to have information from other computers: everything from shared calendars to the weather, or a social media entry. There's so much more you can do with internet access that you need to be able to access remote data.

There's no easy way to keep things in sync, either. Look at the CAP theorem. You can decide which leg you can do without, but you can't solve the distributed computing "problem". The best you can do is be aware of what tradeoff you're making.

magicalhippo

> There's no easy way to keep things in sync, either. Look at the CAP theorem.

Sure there is, you just gotta exploit the multiverse[1]. Keep all the changes in their own branch aka timeline, and when there's some perceived conflict you just say "well in the timeline I'm from, the meeting was moved to 4pm".

[1]: https://www.reddit.com/r/marvelstudios/comments/upgsuk/expla...

card_zero

> nowadays you want to have information from other computers.

Do I? What sort of information ...

> shared calendars

OK yes that would be a valid use, I can imagine some stressed executive with no signal in a tunnel wanting to change some planned event, but also to have the change superseded by an edit somebody else makes a few minutes later.

> the weather

But I don't usually edit the weather forecast.

> a social media entry

So ... OK ... because it's important that my selfie taken in a wilderness gets the timestamp of when I offline-pretend-posted it, instead of when I'm actually online and can see replies? Why is that? Or is the idea that I should reply to people offline while pretending that they can see, and then much later when my comments actually arrive they're backdated as if they'd been there all along?

raincole

> offline-pretend-posted

It's a far, far more complicated mental model than simply posting it. It'd be a huge barrier for normal users (even tech-savvy users, I'd say). People want to post it online and that's it. No one wants an app that requires its users to be constantly aware of syncing state unless they really have no choice. We pretend we can just step on the gas, instead of mixing the gas with air and igniting it with a spark plug, until we need to change the damn plug.

dundarious

Email clients had an outbox that was local-only, and then you pushed to send everything in it. Hiding the outbox is why some of these things seem fiddly to use, despite being conceptually very simple. This model would seem to work very well, at least for non-collaborative changes like IG posts.
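
A minimal sketch of that outbox model, assuming a hypothetical `send` function that rejects while offline:

    // Outbox sketch: writes always succeed locally; sending is a separate,
    // explicit step, like "Send All" in an old mail client.
    type Post = { id: string; body: string; createdAt: number };

    class Outbox {
      private queue: Post[] = [];

      // Composing is purely local and never fails.
      add(body: string): void {
        this.queue.push({ id: crypto.randomUUID(), body, createdAt: Date.now() });
      }

      // Flush when online; `send` is a hypothetical network call.
      async flush(send: (p: Post) => Promise<void>): Promise<void> {
        while (this.queue.length > 0) {
          await send(this.queue[0]); // stops on first failure; retry later
          this.queue.shift();        // dequeue only after a confirmed send
        }
      }
    }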

bluGill

Nearly everything I do is on a shared computer.

At work: I write code, which is in version control. I write design documents (that nobody reads) and put them on a shared computer. I write presentations (you would be better off sleeping through them...) and put them on a shared computer. Often the above are edited by others.

Even at home, my grocery list is shared with my wife. I look up recipes online from a shared computer. My music (that I ripped from CDs) is shared with everyone else in the house. When I play a game I wish my saved games were shared with other game systems (I haven't had time since I had kids, more than 10 years ago). When I take notes about my kid's music lessons they are shared with my wife and kids...

tetralobita

In our company, a team is trying to solve offline selling: items are deducted from stock while the device is offline, and when it comes back online it syncs. There are price and stock changes to be synced.

marginalia_nu

> There's no easy way to keep things in sync, either. Look at the CAP theorem. You can decide which leg you can do without, but you can't solve the distributed computing "problem". The best you can do is be aware of what tradeoff you're making.

Git has largely solved asynchronous decentralized collaboration, but it requires file formats that are ideally as human-understandable as they are machine-readable, or at least diffable/mergeable in a way where both humans and machines can understand the process and results.

Admittedly git's ergonomics aren't the best or most user friendly, but it at least shows a different approach to this that undeniably works.
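
As a toy illustration of why diffable formats matter, here is a naive line-aligned three-way merge (my sketch, not git's actual algorithm; it assumes no lines were inserted or deleted):

    // Naive three-way merge over line-aligned files. Real merges also have
    // to align insertions and deletions; this sketch skips that entirely.
    function merge3(base: string[], ours: string[], theirs: string[]): string[] {
      return base.map((b, i) => {
        const o = ours[i], t = theirs[i];
        if (o === t) return o; // both sides agree (or neither changed)
        if (o === b) return t; // only theirs changed: take it
        if (t === b) return o; // only ours changed: take it
        // Both changed the same line differently: a human has to decide.
        return `<<<<<<<\n${o}\n=======\n${t}\n>>>>>>>`;
      });
    }

The interesting part is the last case: the format being line-diffable is what lets the machine do everything except the genuine conflicts.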

taeric

Git does no such thing. Plain text files with free form merging capabilities somewhat solves the idea that you can merge things. But, to make that work, the heavy lifting has to be done by the users of the system.

So, sure, if you are saying "people trained to use git" there, I agree. And you wind up having all sorts of implicit rules and guidelines that you follow to make it more manageable.

This is a lot like saying roads have solved how to get people using dangerous equipment on a regular basis without killing everyone. Only true if you train the drivers on the rules of the road. And there are many rules that people wind up internalizing as they get older and more experienced.

jordanb

I feel like git set back mainstream acceptance of copy-and-merge workflows possibly forever.

The merge workflow is not inherently complicated or convoluted. It's just that git is.

When DVCSes came out there were three contenders: darcs, mercurial and git.

I evaluated all three and found darcs the most intuitive, but it was very slow. Git was a confused mess, and hg was a great compromise between speed and a simple, intuitive merge model.

I became a big hg advocate but I eventually lost that battle and had to become a git expert. I spent a few years being the guy who could untangle the mess when a junior messed up a rebase merge then did a push --force to upstream.

Now I think I'm too git-brained to think about the problem with a clear head anymore. But I think the fact that DVCS has never found any uptake outside of software development is a failure mostly attributable to git, and the fact that we as developers see DVCS as a "solved problem" needing nothing more than tooling around git is a failure of imagination.

marginalia_nu

Yeah I mostly agree with this. I'm mostly talking about git the model, rather than git the tool when I say git has solved the problem of asynchronous decentralized collaboration.

For local-first async collaboration on something that isn't software development, you'd likely want something that is a lot more polished, and has a much more streamlined feature set. I think ultimately very few of git's chafing points are due to its model of async decentralized collaboration.

robenkleene

> The merge workflow is not inherently complicated or convoluted. It's just that git is.

What makes merging in git complicated? And what's better about darcs and mercurial?

(PS Not disagreeing just curious, I've worked in Mercurial and git and personally I've never noticed a difference, but that doesn't mean there isn't one.)

TylerE

That git won over hg is a true tragedy. The hg ux/ui is so much better.

jjcob

Git works, but it leaves conflict resolution up to the user. That's fine for a tool aimed at professional users, but I don't see it being adopted for mainstream use.

PaulHoule

The funny thing about it is that I see git being used in enterprise situations for non-dev users to manage files, often with a web back end. For instance, you can tell the average person to try editing a file with the web interface to git and they're likely to succeed.

People say git is too "complex" or "complicated", but I never saw end users succeeding with CVS or Mercurial or SVN or Visual SourceSafe the way they do with Git.

"Enterprise" tools (such as business rules engines) frequently prove themselves "not ready for the enterprise" because they don't have proper answers to version control, something essential when you have more than one person working on something. People say "do you really need (the index)" or other things git has but git seemed to get over the Ashby's law threshold and have enough internal complexity to confront the essential complexity of enterprise version control.

criddell

How can you avoid leaving conflict resolution up to the user?

robenkleene

The problem with "human understandable" with respect to resolving syncing conflicts is that it's not an achievable goal for anything that isn't text-first. E.g., visual and audio content will never fit well into that model.

marginalia_nu

I can undo and redo edits in these mediums. Why can't these edits be saved and reapplied?

Not saying this would be in any way easy, but I'm also not seeing any inherent obstacles.
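
One way to picture it: if every edit is an invertible operation, the edit history is just data that can be replayed on another replica. A minimal sketch, using a made-up brightness edit as the non-text example:

    // Edits as invertible, replayable operations (the command pattern).
    interface Edit<S> {
      apply(state: S): S;
      invert(): Edit<S>;
    }

    type Image = { brightness: number };

    // A hypothetical non-text edit; real image edits would be richer.
    class AdjustBrightness implements Edit<Image> {
      constructor(private delta: number) {}
      apply(img: Image): Image {
        return { ...img, brightness: img.brightness + this.delta };
      }
      invert(): AdjustBrightness {
        return new AdjustBrightness(-this.delta);
      }
    }

    // Replaying the log on any replica reproduces the same result;
    // undo is just applying the inverse of the last entry.
    const log: Edit<Image>[] = [new AdjustBrightness(10), new AdjustBrightness(-3)];
    const final = log.reduce((img, e) => e.apply(img), { brightness: 50 }); // 57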

JustExAWS

Git hasn’t solved it in a way that any normal person would want to deal with.

recursivedoubts

“solved”

imagine asking a normie to deal with a merge conflict

marginalia_nu

That's a UX issue with git, not really what's being discussed.

poszlem

Git solved this by pushing the syncing burden onto people. It's no surprise: merge conflicts are famously tricky and always cause headaches. But for apps, syncing really ought to be handled by the machine.

marginalia_nu

If you want local-first, conflict resolution is something you're unlikely to be able to avoid. The other option is to say "whoops" and arbitrarily throw away a change when there's a conflict due to spotty wifi or some such.

Fortunately, a lot of what chafes with git comes down to UX issues more than anything else. Its abstractions are leaky, and its default settings are outright bad. It's very much a tool built by and for kernel developers, with all that entails.

The principle itself has a lot of redeeming qualities, and could be applied to other similar syncing problems without most of the sharp edges that come with the particular implementation seen in git.
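
That "whoops" strategy is what a last-write-wins register does. A minimal sketch of how a concurrent edit gets discarded without anyone ever seeing an error:

    // Last-write-wins: the simplest possible "conflict resolution".
    // Whichever write carries the higher timestamp survives; the other
    // is silently thrown away.
    type LWW<T> = { value: T; ts: number };

    function mergeLWW<T>(a: LWW<T>, b: LWW<T>): LWW<T> {
      return b.ts > a.ts ? b : a;
    }

    // Two offline edits to the same field:
    const phone  = { value: "meeting at 3pm", ts: 1001 };
    const laptop = { value: "meeting at 4pm", ts: 1002 };
    mergeLWW(phone, laptop); // laptop wins; the 3pm edit vanishes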


guywithahat

I also suspect it's more portable: you build one site with one API, and then you just interact with that API across all the devices you support. If you write the app locally, it has to get rewritten for each platform.

bryanlarsen

There's no perfect general solution, but the number of conflicts generated by small teams collaborating in an environment where the internet almost always works is going to be minuscule. No need to let the perfect be the enemy of the good.

tcoff91

CRDTs are a pretty good experience for many data types when it comes to collaborative editing.
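
For example, a grow-only counter, one of the simplest CRDTs: each replica increments only its own slot, and merge is an element-wise max, so replicas converge regardless of sync order. A minimal sketch:

    // G-Counter CRDT: merge is commutative, associative, and idempotent,
    // so replicas converge no matter how often or in what order they sync.
    type GCounter = Map<string, number>; // replicaId -> count

    function increment(c: GCounter, replica: string): void {
      c.set(replica, (c.get(replica) ?? 0) + 1);
    }

    function merge(a: GCounter, b: GCounter): GCounter {
      const out = new Map(a);
      for (const [id, n] of b) out.set(id, Math.max(out.get(id) ?? 0, n));
      return out;
    }

    const value = (c: GCounter) => [...c.values()].reduce((s, n) => s + n, 0);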

paldepind2

> Local-first was the first kind of app. Way up into the 2000s, you'd use your local excel/word/etc, and the sync mechanism was calling your file annual_accounts_final_v3_amend_v5_final(3).xls

To be precise, these apps were not local-_first_; they were local-_only_. Local-first implies that the app first and foremost works locally, but also, secondly, that it is capable of working online and non-locally (usually with some syncing mechanism).

Sharlin

Well, the first kind of PC app, anyway. For decades before that, programs were run on time-sharing mainframes via remote terminals.

VikingCoder

I want many more Local-Only apps, thanks. Self-Hosted.

Or Federated apps, again Self-Hosted.

And I think network infrastructure has been holding us back horribly. I think with something like Tailscale, we can make local-only apps or federated apps way, way easier to write.

hasanhaja

I've been having fun exploring this actually: https://news.ycombinator.com/item?id=45333494

I've found it to be a fun way to build apps.

gritzko

The author’s journey is probably just starting. I had this exact mindset about 10 years ago. Long story short: distributed systems are hard. A linear log of changes is an absolute lie, but it is a lie easy to believe in.

aleph_minus_one

> distributed systems are hard.

While this may be true, the central issue is a different one: most users and/or developers are not very privacy-conscious, so they don't consider it worth the effort to solve the problems that come with such distributed systems.


MangoToupe

> A linear log of changes is an absolute lie

Compared to what?

ainiriand

A graph of changes in a distributed system, I assume.

MangoToupe

...how is that any less of a lie? You can project a linear log into a graph trivially.
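
A small type sketch of the point (my illustration, not any specific system): a linear log is just the degenerate history graph in which every change has exactly one parent, and concurrency is what forces multiple parents:

    // A change history as a DAG: concurrent edits give a node multiple
    // children, and a merge gives a node multiple parents. A linear log
    // is the special case where parents.length === 1 everywhere,
    // i.e. where no concurrency ever happened.
    type Change = {
      id: string;
      parents: string[]; // length 1 in a linear log; 2+ at a merge
      payload: unknown;
    };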

xorvoid

I believe the lack of popularity is more of an economics problem. There are established business models for SaaS apps or freemium with ads. But, the business model for local-first apps is not as lucrative. Those who like the local-first model value features like: data-sovereignty, end-to-end encryption, offline usage, etc. These properties make existing business models hard-to-impossible to apply.

My current thinking is that the only way we get substantial local-first software is if it's built by a passionate open-source community.

godshatter

It's crazy that we live in a time when "pay with money and your data" or "pay with your eyeballs" are the only viable options and "pay with your money without your data" can't even be considered.

chaostheory

Someone has to prove that there's demand for paid local-first subscriptions. Open source and Tailscale can't shoulder it all if you want more adoption.

canpan

Yes, I don't think replicated data structures are the problem.

Look at single-player video games; you cannot get more ideal for local-first. Still, you need a launcher and an internet connection.

SilverbeardUnix

No you don't. There are plenty of games you can buy and go into the wilderness and play just fine offline. Just because game developers WANT you to be online so they can get data doesn't mean you NEED to be online.

Reubachi

Your point is correct, but so is OP's.

There are currently tens of thousands of games that are unplayable because they require pinging a network/patch server that was deprecated long ago.

Patch requirements aside, just as many games are no longer playable due to an incompatible or abandoned OS or codebase, or game-breaking bugs.

In both of these scenarios, my "lifetime license" is no longer usable through no action of my own, which breaks the lifetime license agreement. I shouldn't need to be into IT to understand how to keep a game I bought 5 years ago playable.

The solution to this "problem" offered to the user by the corporate investment firms in control is rolling subscriptions that "keep your license alive", for some reason, rather than properly charging for the service at the time of purchase.

TLDR: Why move the goalposts further in favor of tech/IT/video-game investment firms?

antonvs

This is the primary reason. The heaviest push for SaaS came from Silicon Valley, which wanted a recurring revenue stream model.

MathMonkeyMan

The free software evangelist in me says "because local-first gives more to the user of the software," which will tend not to happen when the user is not in control of the software.

Realistically, the reason is probably that it's easier to make changes if you can assume everything phones home to the mother ship.

Also, an unrelated nit: "Why Local-First Apps Haven’t Become Popular?" is not a question. "Why Local-First Apps Haven’t Become Popular" is a noun phrase, and "Why Haven't Local-First Apps Become Popular?" is a question. You wouldn't say "How to ask question?" but instead "How do you ask a question?"

robenkleene

Apple is practically the most antithetical to "free software" company around, yet Apple maintains perhaps the largest fleet of local-first apps in existence, e.g., off the top of my head: Calendar, Contacts, Keynote, Mail, Notes, Numbers, Photos, and Pages (these are all examples of apps that support multi-device sync and/or real-time collaboration).

I think the truth of your statement is more that free software tends towards what you might call "offline" software (e.g., software that doesn't sync or offer real-time collaboration), because there's more friction for having a syncing backend with free software.

russnewcomer

I'm not sure CRDTs are actually the right answer here for your example of #2, Marco. A double-entry accounting system might actually be more ideal. In that case, what you are keeping in sync is the ledger, but depending on your use-case, that might actually be easier since you can treat them as a stream-of-data, and you would get the 'correct' answer of 100.

In this case, you would need two accounts, a credit account and a debit account. Device A would write +20 to the credit account and -20 to the debit account; device B would write -20 to the credit account and +20 to the debit account. Then, using an HLC (or even not, depending on your use-case again), you get back to 100, which from the description of the problem seems to be the correct answer.

Obviously, if you are editing texts there are very different needs, but this as described is right in the wheelhouse of double-entry accounting.
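
A sketch of that ledger-as-stream idea (hypothetical names; the starting balance of 100 is taken from the example above): entries are immutable facts, syncing is a set union, and the balance is a fold, so the result is order-independent:

    // Each device appends immutable entries; merging is a union keyed by
    // entry id, and the balance is a fold over the merged stream.
    type Entry = { id: string; account: "credit" | "debit"; amount: number };

    function mergeLedgers(a: Entry[], b: Entry[]): Entry[] {
      const byId = new Map(a.map((e) => [e.id, e] as const));
      for (const e of b) byId.set(e.id, e); // union: entries are never edited
      return [...byId.values()];
    }

    const balance = (start: number, ledger: Entry[]) =>
      start + ledger.reduce((s, e) => s + e.amount, 0);

    // The offsetting writes from devices A and B:
    const deviceA: Entry[] = [
      { id: "a1", account: "credit", amount: +20 },
      { id: "a2", account: "debit",  amount: -20 },
    ];
    const deviceB: Entry[] = [
      { id: "b1", account: "credit", amount: -20 },
      { id: "b2", account: "debit",  amount: +20 },
    ];
    balance(100, mergeLedgers(deviceA, deviceB)); // 100, in either merge order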

0xffff2

I feel like I'm taking crazy pills. How on Earth could anyone consider the example in #2 "conflict-free"? You haven't removed the conflict, you've just ignored it! Anything can be conflict-free in that case.

Obviously not every problem will have such an obvious right answer, but given the example the author chose, I don't see how you could accept any solution that doesn't produce "100" as a correct result.

andsoitis

Lotus Notes and later Groove Networks (both brought to us courtesy of Ray Ozzie) provided platforms for creating apps with data synchronization as a first-class citizen.

The technology behind Groove now powers OneDrive and Microsoft 365.

Notes: https://en.wikipedia.org/wiki/HCL_Notes

Groove: https://en.wikipedia.org/wiki/Groove_Networks

sylens

There's no money in making a local first app. Businesses want your data, they want you to be dependent on them, and they want to be able to monetize your behavior and attention

zsoltkacsandi

There is money in local first apps, businesses are just greedy.

PaulHoule

Lotus Notes solved syncing for object databases, but the world forgot.

https://en.wikipedia.org/wiki/HCL_Notes

patwolf

From my time using Notes, I remember lots of manual replication config to get anything to work properly offline, and even then I struggled to get it to work reliably. So while they might have solved it, I don't think their solution was very good.

codegeek

You just reminded me of the nightmare Lotus Notes was. Great idea but absolutely horrendous implementation. Probably the worst piece of software I have ever used and I have been in the industry for 21+ years now.

mschuster91

Ahhh Lotus Notes... in many ways ahead of its time, timeless and horribly outdated at the same time.

echelon

I still remember the crazy password screen with the symbols that changed as you typed.

If that was deterministic, that was a very bad idea.

AlexandrB

IIRC the symbols were basically a hash of the password that let you know if you typed the password correctly without showing it.
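
If it did work that way, it would be a tiny visual checksum: deterministically map a hash of the typed password to a few symbols, so the user recognises "their" pattern without the password being shown. A toy sketch, emphatically not Lotus Notes' actual algorithm (and FNV-1a is not a cryptographic hash):

    // Toy visual checksum: hash the password and render a few symbols.
    const SYMBOLS = ["◆", "●", "▲", "■", "★", "✚", "☾", "☀"];

    function checksumSymbols(password: string, count = 4): string {
      let h = 2166136261; // FNV-1a: fine for a demo, useless for security
      for (const ch of password) {
        h ^= ch.codePointAt(0)!;
        h = Math.imul(h, 16777619);
      }
      let out = "";
      for (let i = 0; i < count; i++) {
        out += SYMBOLS[(h >>> (i * 3)) & 7]; // 3 bits per symbol, 8 symbols
      }
      return out;
    }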

czx111331

I believe that in the right contexts—specifically where eventual consistency is acceptable—the local-first paradigm is highly valuable and will gradually become mainstream. A major factor limiting adoption today is that existing local-first solutions are incomplete: when building such applications, developers must handle many problems that are trivial under strong-consistency or traditional models. This raises the learning cost and creates significant friction for paradigm shifts.

Our recent work on Loro CRDTs aims to bridge this gap by combining them with common UI state patterns. In React, developers can keep using `setState` as usual, while we automatically compute diffs and apply them to CRDTs; updates from CRDTs are then incrementally synced back into UI state [1]. This lets developers follow their existing habits without worrying about consistency between UI state and CRDT state. Paired with the synchronization protocol and hosted sync service, collaboration can feel as smooth as working with a purely local app. We’ve built a simple, account-free collaborative example app[2]. It only has a small amount of code related to synchronization; the rest looks almost the same as a purely local React app.

[1]: https://loro.dev/blog/loro-mirror

[2]: https://github.com/loro-dev/loro-todo
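
The general shape of that mirror pattern, with hypothetical names rather than Loro's actual API (see [1] for the real one): diff successive UI states into CRDT operations, and stream remote CRDT updates back into state:

    // All names here (CrdtDoc, applyDiff, subscribe) are hypothetical.
    // The idea: the UI keeps using plain state; a thin layer translates
    // state diffs into CRDT ops and remote CRDT updates back into state.
    type Todo = { id: string; title: string; done: boolean };

    interface CrdtDoc {
      applyDiff(prev: Todo[], next: Todo[]): void;       // local edits -> ops
      subscribe(onRemote: (next: Todo[]) => void): void; // remote ops -> state
    }

    function mirror(
      doc: CrdtDoc,
      getState: () => Todo[],
      setState: (next: Todo[]) => void,
    ) {
      let prev = getState();
      doc.subscribe((next) => { prev = next; setState(next); });
      // The returned function is used by the UI in place of setState.
      return (next: Todo[]) => {
        doc.applyDiff(prev, next);
        prev = next;
        setState(next);
      };
    }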

poisonborz

The goals have shifted. What is "local" nowadays? ~Everybody uses multiple devices, and things are expected to stay in sync. What users now (should) need are easily self-hosted apps and centralised storage, with clients that can work and cache locally while offline. A good example is Bitwarden.