
Should we design for iffy internet?


111 comments

·June 17, 2025

bob1029

If you really want to engineer web products for users at the edge of the abyss, the most robust experiences are going to be SSR pages that are delivered in a single response with all required assets inlined.

Client-side rendering with piecemeal API calls is definitely not the solution if you are having trouble getting packets from A to B. The more you spread the information across different requests, the more likely you are going to lose packets, force arbitrary retries and otherwise jank up the UI.

From the perspective of the server, you could install some request timing middleware to detect that a client is in a really bad situation and actually do something about it. Perhaps a compromise could be to have the happy path as a websocketed react experience that falls back to an ultralight, one-shot SSR experience if the session gets flagged as having a bad connection.
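A compromise like that needs some way to spot struggling clients on the server side. As a rough sketch (the class name, thresholds, and window size are hypothetical, not from any particular framework), middleware could keep a rolling window of per-session delivery times and flag sessions whose average crosses a threshold:

```python
# Hypothetical thresholds; these would be tuned against real traffic.
SLOW_MS = 2000  # average response delivery above this flags the session
WINDOW = 5      # number of recent requests to average over

class ConnectionQualityTracker:
    """Track per-session response delivery times and flag slow clients."""

    def __init__(self):
        self.samples = {}  # session_id -> list of recent durations (ms)

    def record(self, session_id, duration_ms):
        window = self.samples.setdefault(session_id, [])
        window.append(duration_ms)
        del window[:-WINDOW]  # keep only the most recent WINDOW samples

    def is_degraded(self, session_id):
        window = self.samples.get(session_id, [])
        if len(window) < WINDOW:
            return False  # not enough data to judge yet
        return sum(window) / len(window) > SLOW_MS
```

The request-handling path would then branch on `is_degraded()`: serve the full client-rendered experience on the happy path, and the one-shot inlined SSR page for flagged sessions.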

DannyPage

A big focus is (rightly) on rural areas, but mobile internet packet loss can also be a big issue in cities or places where there are a lot of users. It's very frustrating to be technically online, but effectively offline. An example: Using Spotify on a subway works terribly until you go into Airplane mode, and then it suddenly works correctly with your offline music.

epistasis

When Apple did their disastrous Apple Music transition, I was in the habit of daily recreation that involved driving in areas without mobile access.

All of a sudden one day, I was cut off from all my music, by the creators of the iPod!

I switched away from Apple Music and will never return. 15 years of extensive usage of iTunes, and now I will never trust Apple with my music needs again. I'm sure they don't care, or consider the move a good tradeoff for their user base, but it's the most user hostile thing I've ever experienced in two decades on Apple platforms.

w10-1

Forget internet: just sync.

Add music on macOS, and on your phone. Then sync.

RESULT: one overwrites the other, regardless of any settings.

You no longer have the audio you formerly owned.

simonw

Did the "download" option in Apple Music not work? Or was that not available when they first launched the new app?

amendegree

The OP already had the music downloaded to his device. When Apple switched to the streaming service they deleted all that… you still technically owned the music, but now it had to be streamed. I also don't recall if they started with an offline feature.

epistasis

There was a random smattering of songs from my library on my device, but it didn't correspond to anything I regularly listened to.

I couldn't be bothered to spend time manually selecting stuff to download back then. It was offensive to even be asked to spend 30 minutes manually correcting a completely unnecessary mistake on their part. And this was during a really, really bad time in interface design, with the flat UI idiocy all the rage, and when people were abandoning all UI standards that gave any affordances at all.

If I'm going to go and correct Apple's mistake, I may as well switch to another vendor and do it. Which is what I did. I'm now on Spotify to this day, even though it has many of the same problems as Apple Music. At least Spotify had fewer bugs at the time, and they hadn't deleted music off my device.

Good riddance and I'll never go back to Apple Music.

joshmarinacci

It did have that at launch, but the transition was very confusing. There was (is?) an "iTunes Match" thing to replicate your personal mp3s in the cloud rather than uploading them. It was a real mess.

bityard

Very good point. We had several power outages lasting a few hours lately. (One was just last night.) Every time this happens, my phone's mobile data is totally unusable because the whole neighborhood switches over from scrolling Facebook (et al.) on their wifi to scrolling Facebook on mobile.

I can (and do) find things around the house that don't depend on a screen, but it's annoying to know that I don't really have much of a backup way to access the internet if the power is out for an extended period of time. (Short of plunking down for an inverter generator or UPS I suppose.)

BenjiWiebe

If your ISP is available during a power outage (as they should be) a UPS that only powers a WiFi router could be quite small/cheap.

Or you could use a Raspberry Pi or similar and a USB WiFi adapter (make sure it supports AP mode) and a battery bank, for an "emergency" battery-operated WiFi router that you'd only use during power outages.

hypeatei

I'm on fiber at home and my ISP did a backend update which is dropping packets specifically on IPv6 for some reason. Most sites are unusable and other software isn't handling it very well (e.g. Android), with frequent "no internet" popups.

rendaw

Also subways, and people with cheap data plans that get throttled after 1GB. Google maps regularly says "no results found" because the connection times out.

98codes

The only thing worse than no internet is one bar of signal.

deltaburnt

It's deeply ironic how awfully designed the NYT games app is for offline use given many people use it on the subway. Some puzzles will cache, others won't. They only cache after you manually open them.

genocidicbunny

Speaking of airplanes, I also frequently have issues with apps and websites when using in-flight wifi, due to the high latency and packet loss. Incidentally, Spotify is one of said apps, which often means I need to manually set it to offline mode to get it to work.

zeinhajjali

This reminds me of a project I worked on for a grad school data science course here in Canada. We tried to map this "digital divide" using public data.

Turns out, it's really tough to do accurately. The main reason is that the public datasets are a mess. For example, the internet availability data is in neat hexagons, while the census demographic data is in weird, irregular shapes that don't line up. Trying to merge them is a nightmare and you lose a ton of detail.

So our main takeaway, rather than just being a pretty map, was that our public data is too broken to even see the problem clearly.

I wrote up our experience here if anyone's curious: https://zeinh.ca/projects/mapping-digital-divide/
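Reconciling mismatched zone systems like those is usually done with area-weighted interpolation, which can be sketched in a few lines (the numbers are illustrative, and the uniform-density assumption it rests on is exactly why so much detail gets lost):

```python
def apportion(tract_pop, tract_area, overlap_areas):
    """Area-weighted interpolation: split a census tract's population
    across coverage hexagons in proportion to overlap area.

    tract_pop:     tract population count
    tract_area:    total area of the tract
    overlap_areas: {hex_id: area of intersection with this tract}

    Assumes population is spread uniformly over the tract -- the big
    (and often wrong) assumption behind this method.
    """
    return {h: tract_pop * (a / tract_area)
            for h, a in overlap_areas.items()}

# Hypothetical tract of 1000 people split across two coverage hexes:
shares = apportion(1000, 10.0, {"hex_a": 6.0, "hex_b": 4.0})
```

In a real pipeline the intersection areas would come from a spatial join (e.g. with a GIS library), but the apportionment step itself is just this proportion.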

morleytj

Really interesting perspective, thanks for sharing.

I think in so many fields the datasets are by far the highest impact thing someone can work on, even if it seems a bit mundane and boring. Basically every field I've worked in struggles for want of reliable, well-maintained, open-access data, and when they do get it, it usually sets off a massive amount of related work (I've seen this happen in genetics, and in ML of course once we got ImageNet and also started getting social media text instead of just old newspaper corpora).

That would definitely be advice I'd give to many people searching for a project in a field -- high quality data is the bedrock infrastructure for basically all projects in academic and corporate research, so if you provide the data, you will have a major impact, pretty much guaranteed.

hardolaf

I'm in the USA with nominally a 1.25 Gb/s down, 50 Mb/s up connection from my cable ISP. And you'd think that it would be fast, low latency, and reliable. Well, that would be true except my ISP is Xfinity (Comcast). At least 4 times per week, I experience frequent packet loss that works with older web servers but makes most newer TCP-based technology just fail. And the connection will randomly fail for 10 minutes to 2 days at a time, and sure, they give me a credit for it.

So anyways, I bring this up with my local government in Chicago and they recommend that I switch to AT&T Fiber because it's listed as available at my address in the FCC's database. Well, I would love to do that except that

1. The FCC's database was wrong and rejected my corrections multiple times before AT&T finally ran fiber to my building this year (only 7 years after they claimed that it was available in the database despite refusing to connect to the building whenever we tried).

2. Now that it is in the building, their Fiber ISP service can't figure out that my address exists and has existing copper telephone lines run to it by AT&T themselves so their system cannot sell me the service. I've been arguing with them for 3 months on this and have even sent them pictures of their own demarc and the existing copper lines to my unit.

3. Even if they fixed the previous issue, they coded my address as being on a different street than its mailing address and can't figure out how to sell me a consumer internet plan with this mismatch. They could sell me a business internet plan at 5x the price though.

And that's just my personal issues. And I haven't even touched on how not every cell phone is equally reliable, how the switch to 5G has made many cell phones less reliable compared to 3G and 4G networks, how some people live next to live event venues where they can have great mobile connections 70% of the time but the other 30% of the time it becomes borderline unusable, etc.

HPsquared

Oddly fitting (or perhaps that's double irony) that your "mapping the digital divide" project was derailed by the literal digital mapping division boundaries.

nine_k

At one of my previous jobs, we designed a whole API to be slightly more contrived but requiring only one round-trip for all key data, to address the iffy internet connectivity most of our users had. The frontend also did a lot of background loading to hide the latency when scrolling.

It's really eye-opening to set up something like toxiproxy, configure bandwidth limitations, latency variability, and packet loss in it, and run your app, or your site, or your API endpoints over it. You notice all kinds of UI freezing, lack of placeholders, gratuitously large images, lack of / inadequate configuration of retries, etc.
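The payoff of collapsing everything into one round trip is easy to quantify. A quick seeded Monte Carlo sketch (illustrative, not from any production code) estimates how many attempts each request needs on a lossy link; every extra piecemeal call pays this cost again:

```python
import random

def expected_attempts(loss_rate, trials=100_000, seed=42):
    """Monte Carlo estimate of how many attempts a request needs before
    one survives a lossy link (each attempt independently fails with
    probability loss_rate). Analytically this is 1 / (1 - loss_rate)."""
    rng = random.Random(seed)  # seeded for reproducibility
    total = 0
    for _ in range(trials):
        attempts = 1
        while rng.random() < loss_rate:
            attempts += 1  # retry after a dropped request
        total += attempts
    return total / trials
```

At 30% loss each request averages roughly 1.4 attempts; ten chained API calls pay that retry tax (and its latency) ten times, while a single all-in-one response pays it once.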

Tteriffic

Years ago, APIs and the apps that used them were expected to do some work offline and on slow networks. Then, suddenly, everyone was expected to have stable Internet to do anything. The reason, I think, is that the few apps that expected to be always online seemed better to users and were easier to architect. So most architectures went that way.

baby_souffle

I wish more developers bothered to test on flaky connections. Absolutely infuriating when an app can't keep up with your muscle memory...

Sanzig

While many websites are bad about large, unoptimized payload sizes, they are even worse about latency sensitivity.

You can easily see this when using WiFi aboard a flight, where latency is around 600 msec at minimum (most airlines use geostationary satellites; NGSO for airline use isn't quite there yet). There is so much stuff that happens serially in back-and-forth client-server communication in modern web apps. The developer sitting in SF with a sub-10 ms latency to their development instance on AWS doesn't notice this, but it's sure as heck noticeable when the round trip is 60x that. Obviously, some exchanges have to be serial, but there is a lot of room for optimization and batching that just gets left on the floor.

It's really useful to use some sort of network emulation tool like tc-netem as part of basic usability testing. Establish a few baseline cases (slow link, high packet loss, high latency, etc) and see how usable your service is. Fixing it so it's better in these cases will make it better for everyone else too.
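The latency amplification is simple arithmetic. A minimal sketch (the round-trip counts are hypothetical) shows why serial exchanges that are invisible at sub-10 ms become seconds-long at satellite latencies:

```python
def page_load_time(rtt_ms, serial_round_trips, per_request_server_ms=0):
    """Lower bound on load time when exchanges happen strictly in series.
    Bandwidth is ignored entirely; this is the latency cost alone."""
    return serial_round_trips * (rtt_ms + per_request_server_ms)

# A hypothetical page needing 8 serial exchanges (DNS, TLS handshake,
# HTML, then a chain of dependent API calls):
dev_laptop = page_load_time(10, 8)   # developer near the datacenter
in_flight = page_load_time(600, 8)   # geostationary satellite link
```

Batching two dependent calls into one removes a full round trip, which at 600 ms RTT saves more than half a second on its own.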

catwhatcat

NB: modern browsers have a "throttling" dropdown/selector built into the dev tools (under 'network'), akin to tc-netem.

HPsquared

Someone needs to package a browser bundled with a variable latency network layer. Maybe a VM?

odo1242

Chrome and Firefox and Safari let you add latency in developer tools

immibis

You can also just live in New Zealand, where your minimum ping time to anywhere relevant is 200-300ms.

GuB-42

The short answer is yes, and there are tools to help you. There are ways to simulate a poor network in the dev tools of major browsers, in the Android emulator, there is "Augmented Traffic Control" by Facebook, "Network Link Conditioner" by Apple and probably many others.

It is telling that tech giants make tools to test their software in poor networking conditions. It may not look like they care, until you try software by those who really don't care.

potatolicious

A good point. The author does briefly address the point of mobile internet but I think it deserves a lot more real estate in any analysis like this. A few more points worth adding:

- Depending on your product or use case, somewhere between a majority and a vast majority of your users will be using your product from a mobile device. Throughput and latency can swing wildly over time. You might be able to squeeze 30Mbps and 200ms pings for one request and then face 2Mbps and 4000ms pings seconds later.

- WiFi generally sucks for most people. The fact that they have a 100Mbps/20Mbps terrestrial link doesn't mean squat if they're eking out 3Mbps with eye-watering packet loss because they're in their attic office. The vast majority of your users are using wireless links (WiFi or cell) and are not in any way hardlined to the internet.

aidenn0

I don't use an iPhone, but my wife does. She says that it will remove apps from the device that you haven't used in a while, and then automatically re-download when you try to run them. On our WiFi at home, that's fine, but if we are out and about it can take up to an hour to download a single app.

jurip

You can disable that (Settings → Apps → App Store → Offload Unused Apps.)

It's a nice feature, but it would be even nicer if you could pin some apps to prevent their offloading even if you haven't used them in ages.

joshstrange

> but it would be even nicer if you could pin some apps to prevent their offloading even if you haven't used them in ages.

That change would make it _viable_ for me at all; right now it's next to useless.

Currently iOS will offload apps that provide widgets (like Widgetsmith) even when I have multiple Widgetsmith widgets on my 1st and 2nd homescreens, I just never open the app (I don't need to, the widgets are all I use). One day the widgets will just be black and clicking on them does nothing. I have to search for Widgetsmith and then make the phone re-download it. So annoying.

Also annoying: you can get push notifications from offloaded apps. Tapping on the notification does _nothing_: no alert, no re-download, just nothing. Again, you have to connect the dots and redownload it yourself.

This "feature" is very badly implemented. If they just allowed me to pin things and added some better UX (and logic for the widget issue) it would be much better.

pimlottc

Definitely, I had this problem on an old iPad where it would often decide to unload my password manager...

pimlottc

Note that this should only happen when you're running low on storage. [0] But yes, it can be very annoying.

0: https://support.apple.com/guide/iphone/manage-storage-on-iph...

aidenn0

I've also noticed that the marginal cost of larger storage on an iPhone is significantly higher than on Android (e.g. my phone was $220 with 256GB of storage; it's $100 per 128GB to upgrade the iPhone 16 storage), making people much more likely to be low on storage.

o11c

This fails to address the main concern I run into in practice: can you recover if some resources timed out while downloading?

This often fails in all sorts of ways:

* The client treats timeout as end-of-file, and thinks the resource is complete even though it isn't. This can be very difficult for the user to fix, except as a side-effect of other breakages.

* The client correctly detects the truncation, but either it or the server are incapable of range-based downloads and try to download the whole thing from scratch, which is likely to eventually fail again unless you're really lucky.

* Various problems with automatic refreshing.

* The client's only (working) option is "full page refresh", and that re-fetches all resources including those that should have been cached.

* There's some kind of evil proxy returning completely bogus content. Thankfully less common on the client end in a modern HTTPS world, but there are several ways this can still happen in various contexts.
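The second failure mode above (truncation detected, but no way to resume) comes down to whether the client can turn what it already has into a Range request. A minimal sketch of that decision, assuming the server sent a Content-Length on the original response:

```python
def resume_plan(expected_length, received_bytes):
    """Decide how to continue after a possibly truncated download.

    Returns None if the body is complete, otherwise the Range header
    needed to fetch only the missing tail. Resuming also requires the
    server to support Range requests (Accept-Ranges: bytes); without
    that, the client's only option is restarting from byte zero.
    """
    if expected_length is None:
        # No Content-Length: truncation is undetectable at this layer
        # unless the transfer was chunked and terminated cleanly.
        return None
    if received_bytes >= expected_length:
        return None  # body is complete
    # HTTP byte ranges are inclusive on both ends.
    return {"Range": f"bytes={received_bytes}-{expected_length - 1}"}
```

Clients that skip this check and treat the truncated body as complete exhibit exactly the first failure mode: a cached, silently broken resource.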

1970-01-01

     wget -c https://zigzag.com/file1.zip

Note that -c only works with FTP servers and with HTTP servers that support the "Range" header.

demosthanos

> This shows pretty much what I'd expect: coverage is fine in and around cities and less great in rural areas. (The Dakotas are an interesting exception; there's a co-op up there that connected a ton of folks with gigabit fiber. Pretty cool!)

Just a warning about the screenshot he's referencing here: the slice of map that he shows is of the western half of the US, which includes a lot of BLM land and other federal property where literally no one lives [0], which makes the map look a lot sparser in rural areas than it is in practice for humans on the ground. If you look instead at the Midwest on this map you'll see pretty decent coverage even in most rural areas.

The weakest coverage for actually-inhabited rural areas seems to be the South and Appalachia.

[0] https://upload.wikimedia.org/wikipedia/commons/0/0f/US_feder...

Workaccount2

This gets down to a fundamental problem that crops up everywhere: How much is x willing to exponentially sacrifice to satisfy the long tail of y?

It's grounds for endless debate because it's inherently a fuzzy answer, and everyone has their own limits. However the outcome naturally becomes an amalgamation of everyone's response. So perhaps a post like this leads to a few more slim websites.

reaperducer

How much is x willing to exponentially sacrifice to satisfy the long tail of y?

Part of the problem is the acceptance of the term "long tail" as normal. It is not. It is a method of marginalizing people.

These are not numbers, these are people. Just because someone is on an older phone or a slower connection does not make them any less of a human being than someone on a new phone with the latest, fastest connection.

You either serve, or you don't. If your business model requires you to ignore 20% of potential customers because they're not on the latest tech, then your business model is broken and you shouldn't be in business.

The whole reason companies are allowed to incorporate is to give them certain legal and financial benefits in exchange for providing benefits (economic and other) to society. If your company can't hold up its end of the bargain, then please do go out of business.

ryandrake

> You either serve, or you don't. If your business model requires you to ignore 20% of potential customers because they're not on the latest tech, then your business model is broken and you shouldn't be in business.

Or, at least the business needs to recognize that their ending support for Y is literally cutting off potential customers, and affirmatively decide that's good for their business. Ask your company's sales team if they'd be willing to answer 10% of their inbound sales calls with "fuck off, customer" and hang up. I don't think any of them would! But these very same companies think nothing of ending support for 'old' phones or supporting only Chrome browser, or programming for a single OS platform, which is effectively doing the same thing: Telling potential customers to fuck off.

sealeck

There's obviously some trade-off here: should your business support fax machines? Work on a Nokia brick? These are clearly impractical. But slow internet connection speeds are a thing that everyone deals with, and therefore it makes sense for your software to be able to handle them.

SoftTalker

Nonsense. There are all kinds of businesses that target specific customer segments, and will even flat out refuse to do business with some others.

jazzyjackson

Article skips consideration for shared wifi such as cafes where, IME, a lot of students do their work. Consumer wifi routers might have a cap of ~24 clients, and kind of rotate which clients they're serving, so not only is your 100Mbit link carved up, but you periodically get kicked off and have to renew your connection. I cringe when I see people trying to use slack or office365 in this environment.

Grateful for the blog w/ nice data tho TY

hamandcheese

I have never experienced this. Then again, I'm not sure if the cafes I frequent have 24+ people connected to wifi at a time.

jonah-archive

A few years ago I was at a cafe in a venue that had a large conference/gathering, and their router was handing out 24 hour DHCP leases and ran out of IP addresses. It was a fairly technical group so me and a couple other people set up a table with RFC 2322-style pieces of paper with IP/gateway info ("please return when finished") and it worked surprisingly well!

myself248

Did you dub yourselves the Impromptu Assigned Numbers Authority?

jmajeremy

I'm a minimalist in this regard, and I really believe that a website should only be as complex as it needs to be. If your website requires fast Internet because it's providing some really amazing service that takes advantage of those speeds, then go for it. If it's just a site to provide basic information but it loads a bunch of high-res images and videos and lengthy javascript/css files, then you should consider trimming the fat and making it smaller. Personally I always test my website on a variety of devices, including an old PC running Windows XP, a Mac from 2011 running High Sierra, an Android phone from 2016, and a Linux machine using Lynx text browser, and I test loading the site on a connection throttled to 128kbps. It doesn't have to run perfectly on all these devices, but my criterion is that it's at least usable.

RajT88

I mean. I prefer my news sites plaintext. I think most video calls should be audio calls, and most audio calls could have been emails.

I lived happily on dialup when I was a teenager, with just one major use case for more bandwidth.

keysdev

Back in 2013 I was in a situation where we only had EDGE internet for half a year, for 10 people. Ever since then I've promoted text web pages. Not everyone has fast Internet.