Firefox 32-bit Linux Support to End in 2026

e2le

Seems sensible. Only 2.6% of users (with telemetry enabled) are on 32-bit Windows, while 6.4% are running 32-bit Firefox on 64-bit Windows[0]. 32-bit Linux might see more use, but it isn't broken out in the stats; only 5.3% of users are running Linux at all[1], and I doubt many of them enable telemetry.

Maybe they could also drop support for older x86_64 CPUs and release more optimised builds. Most Linux distributions are raising their baseline to x86-64-v2 or higher, and most Firefox users (>90%)[0] seem to meet at least the x86-64-v2 requirements (a quick way to check your own CPU is sketched below the links).

[0]: https://firefoxgraphics.github.io/telemetry/#view=system

[1]: https://firefoxgraphics.github.io/telemetry/#view=general
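On that x86-64-v2 point, here's the promised sketch: a tiny check of which microarchitecture level the local CPU meets. Note this is an assumption about toolchain support - the level names ("x86-64-v2" and up) are only understood by __builtin_cpu_supports in GCC 12+ and similarly recent Clang.

  #include <stdio.h>

  int main(void) {
      __builtin_cpu_init();   /* populate the CPU feature cache */
      printf("x86-64-v2: %s\n", __builtin_cpu_supports("x86-64-v2") ? "yes" : "no");
      printf("x86-64-v3: %s\n", __builtin_cpu_supports("x86-64-v3") ? "yes" : "no");
      printf("x86-64-v4: %s\n", __builtin_cpu_supports("x86-64-v4") ? "yes" : "no");
      return 0;
  }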

pigeons

That seems like a lot of people to abandon! Perhaps it's the right financial decision, I don't know, but it's a significant number of users.

pavon

They aren't ending support for 32-bit Windows. If the ratio of 32/64 bit users on Linux matched those on Windows, then this would affect 0.5% of their users.
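Spelling out that extrapolation (an assumption, since the telemetry doesn't break Linux down by bitness): apply Windows' combined 32-bit share to the 5.3% Linux share:

  0.053 × (0.026 + 0.064) ≈ 0.005, i.e. roughly 0.5% of all users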

zozbot234

Does this mean that 32-bit Linux users will be able to run more up-to-date versions using Wine?

gweinberg

Abandon is too strong a word. I imagine most people who are still using 32-bit operating systems aren't too concerned about getting the very latest version of Firefox either.

yreg

They might not be concerned, but websites using new standards will slowly start to break for them.

metalliqaz

Within those numbers are people who don't really have a preference one way or another and just didn't bother to upgrade. I have to imagine that the group of people who must use 32-bit and need modern features is vanishingly small.

pdntspa

I would bet a lot of those folks are running embedded linux environments. Kiosks, industrial control, signage, NUCs, etc. I know that as of about 6 years ago I was still working with brand-new 32-bit celeron systems for embedded applications. (Though those CPUs were EOL'd and we were transitioning to 64-bit)

kstrauser

I think that’s the right way to look at it. If you want a 32 bit system to play with as a hobby, you know you’re going to bump into roadblocks. And if you’re using a 20 year old system for non-hobby stuff, you already know there’s a lot of things that don’t work anymore.

waterhouse

"Modern features" are one thing; "security updates" are another. According to the blog post, security updates are guaranteed for 1 year.

pigeons

It's an actual migration to a new platform, though, more than just not bothering to upgrade.

rolph

Some people use older tech precisely because it is physically incapable of facilitating some unpalatable tech that they don't require.

lou1306

Mozilla is in extremely dire straits right now, so unless this "lot of people" makes a concerted donation effort to keep the lights on, I would hardly be shocked by the sunsetting.

nicoburns

Dire straits? They had $826.6M in revenue in 2024.

They will be in dire straits if the Google money goes away for some reason, but right now they have plenty of money.

(not that I think it makes any sense for them to maintain support for 32-bit CPUs)

kstrauser

I’d have to agree. I doubt there are that many (in relative terms) people browsing the web on 32-bit CPUs and expecting modern experiences. I’ve gotta imagine it would be pretty miserable, what with the inherent RAM limitations on those older systems, and I’m sure JavaScript engines aren’t setting speed records on Pentium 4s.

cosmic_cheese

Yeah consumer CPUs have been 64-bit since what, the PowerPC G5 (2003), Athlon 64 (2003), and Core 2 (2006)? There were a few 32-bit x86 CPUs that were released afterward, but they were things like Atoms which were quite weak even for the time and would be practically useless on the modern internet.

More generally I feel that Core 2 serves as a pretty good line in the sand across the board. It’s not too hard to make machines of that vintage useful, but becomes progressively challenging with anything older.

Sohcahtoa82

For what it's worth, people may have been running 64-bit CPUs, but many were still on 32-bit OSes. I was on 32-bit XP until I upgraded to 64-bit Win7.

kstrauser

I mentioned elsewhere, but Apple started selling 32-bit Core (not Core 2) MacBook Pros in 2006. Those seemed dated even at the time. I’d call them basically the last of the 32-bit premium computers from a major vendor.

Frankly, anything older than that sucks so much power per unit of work that I wouldn’t want to use them for anything other than a space heater.

selectodude

Intel Prescott, so like 2004.

anthk

32-bit Atom netbook here. I use offpunk with gopher://magical.fish among tons of services (and HN works fine there), and gemini://gemi.dev over the Gemini protocol with Telescope as the client. mpv+yt-dlp+streamlink complement the video support. Miserable?

Go try browsing the web without uBlock Origin today under an i3.

Wowfunhappy

I haven’t tried it, but as bloated as the web is, I don’t think it’s so bad that you need gigabytes of memory or a blazing fast CPU to e.g. read a news website.

As long as you don’t open a million tabs and aren’t expecting to edit complex Figma projects, I’d expect browsing the web with a Pentium + a lightweight distro to be mostly fine.

Idk, I think this is sad. Reviving old hardware has long been one thing Linux is really great at.

doubled112

Try it and come back to let us know. The modern web is incredibly heavy. Videos everywhere, tons of JavaScript, etc.

My wife had an HP Stream thing with an Intel N3060 CPU and 4GB of RAM. I warned her but it was cheap enough it almost got the job done.

Gmail's web interface would take almost a minute to load. It uses about 500MB of RAM by itself running Chrome.

Does browsing the web include checking your email? Not if you need web mail, apparently.

Check out the memory usage for yourself one of these days on the things you use daily. Could you still do them?

mschuster91

I have a ThinkPad X1 Gen3 Tablet (20KK) here for my Windows needs, my daily driver is an M2 MBA, and my work machine is a 2019 16-inch MBP (although admittedly, that beast has an i9...).

Got the ThinkPad for half the eBay value at a hamfest. Made in 2018-ish, i5-8350U CPU... It's a nice thing; the form factor is awesome and so is the built-in LTE modem. The problem is, more than a dozen Chrome tabs and it slows to a crawl. Even the prior work machine, a 2015 MBP, performed better.

And yes, you absolutely need a beefy CPU for a news site. Just look at Süddeutsche Zeitung, a reputable newspaper: 178 requests, 1.9 MB, 33 seconds load time. And almost all of that crap is some sort of advertising - and that despite me being an actual paying subscriber, with adblock enabled on top of that.

darkmighty

> Maybe they could also drop support for older x86_64 CPUs and release more optimised builds

Question: Don't compilers support multiple ISA versions, similar to web polyfills, and select the appropriate instructions at runtime? I suppose the runtime checks have some cost. At least I don't think I've ever run anything that errored out due to specific missing instructions.

zokier

There was a recent story about F-Droid running ancient x86-64 build servers and having issues due to lacking ISA extensions:

https://news.ycombinator.com/item?id=44884709

But generally it is rare to see anything higher than x86-64-v3 as a requirement, and that works with almost all CPUs sold in the past 10+ years (Atoms being a prominent exception).

igrunert

A CMPXCHG16B instruction is going to be faster than a function call; and even if the function is inlined, there's still a binary size cost.

The last processor without the CMPXCHG16B instruction was released in 2006 so far as I can tell. Windows 8.1 64-bit had a hard requirement on the CMPXCHG16B instruction, and that was released in 2013 (and is no longer supported as of 2023). At minimum Firefox should be building with -mcx16 for the Windows builds - it's a hard requirement for the underlying operating system anyway.
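To illustrate, a minimal sketch of what -mcx16 buys (assuming GCC or Clang on x86-64; the legacy __sync builtin on a 16-byte integer is expanded inline when the flag is given):

  /* Build with: cc -O2 -mcx16 cas16.c */
  #include <stdbool.h>

  static __int128 shared;

  bool cas16(__int128 expected, __int128 desired) {
      /* With -mcx16 this expands inline to LOCK CMPXCHG16B; without the
         flag it becomes an out-of-line call (or fails to link). */
      return __sync_bool_compare_and_swap(&shared, expected, desired);
  }

  int main(void) {
      return cas16(0, 1) ? 0 : 1;   /* swaps 0 -> 1, returns success */
  }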

enedil

Let me play devil's advocate: for some reason, functions such as strcpy in glibc have multiple runtime implementations and are selected by the dynamic linker at load time.
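That mechanism is GNU ifunc, and it's available to ordinary programs too. A minimal sketch of the idea with illustrative names (glibc's real string routines are hand-written assembly variants selected the same way; this needs a GNU toolchain on an ELF platform):

  /* Build with: cc -O2 ifunc_demo.c */
  #include <stddef.h>

  static size_t strlen_generic(const char *s) {
      size_t n = 0;
      while (s[n]) n++;
      return n;
  }

  /* Stand-in for a vectorized variant; body kept trivial for brevity. */
  __attribute__((target("avx2")))
  static size_t strlen_avx2(const char *s) {
      size_t n = 0;
      while (s[n]) n++;
      return n;
  }

  /* Resolver: run once by the dynamic linker, before main(). */
  static size_t (*resolve_strlen(void))(const char *) {
      __builtin_cpu_init();
      return __builtin_cpu_supports("avx2") ? strlen_avx2 : strlen_generic;
  }

  size_t my_strlen(const char *s) __attribute__((ifunc("resolve_strlen")));

  int main(void) {
      return my_strlen("ok") == 2 ? 0 : 1;
  }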

ChocolateGod

They could always just make the updater/installer install a version optimized for the CPU it's going to be installed on.

wtallis

As far as I can tell, GCC supports compiling multiple versions of a function, but can't automatically decide which functions to do that for, or how many versions to build targeting different instruction set extensions. The programmer needs to explicitly annotate each function, meaning it's not practical to do this for anything other than obvious hot spots.
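The annotation in question is target_clones. A sketch of what it looks like, on a hypothetical hot-spot function (GCC emits one clone per listed target plus a hidden resolver that picks one at load time):

  /* Build with: cc -O3 dot.c (GCC 6+; needs ifunc support in the libc). */
  __attribute__((target_clones("default", "sse4.2", "avx2")))
  double dot(const double *a, const double *b, int n) {
      double acc = 0.0;
      for (int i = 0; i < n; i++)
          acc += a[i] * b[i];   /* auto-vectorized differently per clone */
      return acc;
  }

  int main(void) {
      double a[4] = {1, 2, 3, 4}, b[4] = {4, 3, 2, 1};
      return dot(a, b, 4) == 20.0 ? 0 : 1;
  }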

mort96

You can do that to some limited degree, but not really.

There are more relevant modern examples, but one example that I think really illustrates the issue is floating point instructions. The x87 instruction set is the first set of floating point instructions for x86 processors, introduced with the 8087 coprocessor around 1980. In the late 90s/early 2000s, Intel released CPUs with the new SSE and SSE2 extensions, which took a new approach to floating point (x87 was really designed for use with a separate floating point coprocessor, a design that's unfortunate now that CPUs have native floating point support).

So modern compilers generate SSE instructions rather than the (now considered obsolete) x87 instructions when working with floating point. Trying to run a program compiled with a modern compiler on a CPU without SSE support will just crash with an illegal instruction exception.

There are two main ways we could imagine supporting x87-only CPUs while using SSE instructions on CPUs with SSE:

The first option: every time the compiler wants to generate a floating point instruction (or sequence of floating point instructions), it could emit the x87 instruction(s), the SSE instruction(s), and a conditional branch to the right one based on SSE support. This would tank performance. Any performance saving you get from using an SSE instruction instead of an x87 instruction is probably outweighed by the branch.

The other option is: you could generate one x87 version and one SSE version of every function which uses floats, and let the dynamic linker sort out function calls, picking the x87 version on old CPUs and the SSE version on new CPUs. This would more or less leave performance unaffected, but it would, in the worst case, almost double your code size (since you may end up with two versions of almost every function).

And in fact, it's worse: the original SSE only supports 32-bit floats, while SSE2 supports 64-bit floats. So you want one version of every function which uses x87 for everything (for the really old CPUs), one version which uses x87 for 64-bit floats and SSE for 32-bit floats, and one version which uses SSE and SSE2 for all floats. Oh, and SSE3 added some useful instructions, so you want a fourth version of some functions which uses SSE3, with a slower fallback on systems without it. Suddenly you're generating four versions of most functions. And this is only from SSE, without considering the other axes along which CPUs differ.

You have to actively make a choice here about what to support. It doesn't make sense to ship every possible permutation of every function; you'd end up with massive executables. You typically assume a baseline instruction set from some time in the past 20 years, so you're typically gonna let your compiler go wild with SSE/SSE2/SSE3/SSE4 instructions and let your program crash on the i486. For specific functions which get a particularly large speed-up from something more exotic (say, AVX-512), you can manually include one exotic version and one fallback version of that function.

But this causes the problem that most of your program is gonna get compiled against some baseline, and the more constrained that baseline is, the more CPUs you're gonna support, but the slower it's gonna run (though we're usually talking single-digit percent differences, not orders of magnitude).
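For that last pattern - one exotic version plus a fallback for a hot function - the usual shape is a function pointer filled in on first use. A sketch with hypothetical names, not Firefox's actual code:

  #include <stddef.h>

  static float sum_scalar(const float *x, size_t n) {
      float acc = 0.0f;
      for (size_t i = 0; i < n; i++) acc += x[i];
      return acc;
  }

  /* Same logic; the target attribute lets the compiler use AVX-512
     when vectorizing this body. */
  __attribute__((target("avx512f")))
  static float sum_avx512(const float *x, size_t n) {
      float acc = 0.0f;
      for (size_t i = 0; i < n; i++) acc += x[i];
      return acc;
  }

  static float (*sum_impl)(const float *, size_t);

  float sum(const float *x, size_t n) {
      if (!sum_impl) {               /* one-time check; a real build would
                                        use ifunc or an atomic store here */
          __builtin_cpu_init();
          sum_impl = __builtin_cpu_supports("avx512f") ? sum_avx512
                                                       : sum_scalar;
      }
      return sum_impl(x, n);
  }

  int main(void) {
      float x[3] = {1.0f, 2.0f, 3.0f};
      return sum(x, 3) == 6.0f ? 0 : 1;
  }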

nisegami

I consider it unlikely, but perhaps there are some instructions that don't have a practical polyfill on x86?

PhilipRoman

The only thing that comes to mind is some form of atomic instructions that need to interact with other code in well defined ways. I don't see how you could polyfill cmpxchg16b for example.
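Right - the best you can do is what GCC's libatomic does for unsupported sizes: fall back to a lock, which is only correct if every reader and writer of the object goes through the same lock. A sketch of the idea (not libatomic's actual code, which hashes addresses to a table of locks):

  /* Build with: cc -O2 -pthread cas16_fallback.c */
  #include <pthread.h>
  #include <stdbool.h>
  #include <string.h>

  static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

  /* Lock-based stand-in for CMPXCHG16B. Correct only if *all* accesses
     to the object take this lock too - which is why it can't transparently
     polyfill code compiled to use the real instruction. */
  bool cas16_fallback(void *obj, void *expected, const void *desired) {
      pthread_mutex_lock(&lock);
      bool ok = memcmp(obj, expected, 16) == 0;
      if (ok)
          memcpy(obj, desired, 16);       /* swap in the new value */
      else
          memcpy(expected, obj, 16);      /* report the current value */
      pthread_mutex_unlock(&lock);
      return ok;
  }

  int main(void) {
      char obj[16] = {0}, exp[16] = {0}, des[16] = {1};
      return cas16_fallback(obj, exp, des) ? 0 : 1;
  }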

arp242

> 32-bit Linux might see more use

Probably less, not more. Many distros have either stopped supporting 32-bit systems or are planning to. As the announcement says, that's why they're stopping support now.

snackbroken

Less than 2.6% of browser users (with telemetry enabled) are using Firefox. Should the web drop support for Firefox? Seems sensible. (I would hope not)

Ukv

It'd be ~0.1% of Firefox users that use 32-bit Linux, extrapolating from e2le's statistics, not 2.6%. Have to draw the line at some point if an old platform is becoming increasingly difficult to maintain - websites today aren't generally still expected to work in IE6.

zamadatix

Firefox doesn't need special support from the web; the same can't be said of architecture-specific binaries.

postepowanieadm

What's the threshold for a minority to be ignored?

RainyDayTmrw

That's surprising. Why is there such a comparatively large number using 32-bit Firefox on 64-bit Windows?

bmicraft

Some people are under the misapprehension that 32-bit programs need less RAM, which might explain that, but it's still a large number regardless.

ars

If they are like me, they simply never realized they needed to re-install Firefox after upgrading the OS.

Mozilla should try to automate this switch where the system is compatible with it.

3np

I believe the Raspberry Pi 4B and 400 still only have first-class drivers for 32-bit?

Kiosks, desktops, and whatnot on Raspberry Pis are still on 32-bit and likely running Firefox without telemetry.

csande17

They edited the article to clarify that they're only dropping support for 32-bit x86.

3np

Good to see

neilv

Seems reasonable by Mozilla, to me, given precedents like the new Debian release not doing 32-bit release builds.

And doing security updates on ESR for a year is decent. (Though people using non-ESR stream builds of Firefox will much sooner have to downgrade to ESR, or be running with known vulnerabilities.)

If it turns out there's a significant number of people who really want Firefox on 32-bit x86, would it be viable for non-Mozilla volunteers to fork the current ESR or main stream, do bugfixes, backport security fixes, and distribute that unofficial or rebranded build?

What about volunteers trying to keep the main stream development backported? Or is that likely to become prohibitively hard at some point? (And if likely to become too hard, is it better to use that as a baseline going forward with maintenance, or to use the ESR as that baseline?)

pavon

The last release to support 32-bit x86 hardware for popular distros was:

  Distro       | Release | Support ends | Extended support ends
  -------------|---------|--------------|-----------------------
  SLES 11      | 2009-03 | 2019-03      | 2022-03 / 2028-03
  RHEL 6       | 2010-11 | 2019-08      | 2024-06 / 2029-05
  Arch         | 2017-11 | *Ongoing releases via unofficial community project
  Ubuntu 18.04 | 2018-04 | 2023-05      | 2028-04 / 2030-04
  Fedora 31    | 2019-10 | 2020-11      | N/A
  Slackware 15 | 2022-02 | Ongoing; this is the most recent release
  Debian 12    | 2023-06 | 2026-06      | 2028-06
  Gentoo       | Ongoing |              |
By the time Firefox 32-bit is dropped, all the versioned distros will be past their general support date and into extended support, leaving Gentoo, Arch32, and a handful of smaller distros. Of course, there are also folks running a 64-bit kernel with 32-bit Firefox to save memory.

Arnavion

>[Updated on 2025-09-09 to clarify the affected builds are 32-bit x86]

That's nice... When this was originally posted on 09-05 it just mentioned "32-bit support", so I'd been worried this would be the end of me using FF on a Microsoft Surface RT (armv7, running Linux).

dooglius

Does this mean they are deleting a bunch of code, or just that people will have to compile it manually? I'd imagine there is a lot of 32-bit specific code, but how much of that is 32-bit-Linux specific code?

Night_Thastus

I'm honestly surprised just about anything supports 32-bit these days.

It's fine to keep hosting the older versions for download and pointing users to them if they need it. But other than that, I see zero reason to put in any effort at all to support 32-bit. It's ancient, and people moved on what, well over a decade and a half ago?

If I were in charge I'd have dropped active development for it probably 10 years ago.

waynesonfire

That's what Google did with Chrome.

fabrice_d

Are you sure about that? There are Android Go targets that are 32-bit to reduce memory usage, even on 64-bit CPUs.

fulafel

What is the memory usage difference like?

ars

32-bit Firefox doesn't work anyway. I had an old 32-bit Firefox and didn't change it when I switched to a 64-bit installation (I didn't realize).

It crashed NON-STOP. And it would not remember my profile when I shut down, which made the crashes even worse, since I lost anything I was working on.

I finally figured out the problem, switched to 64 bit and it was like magic: Firefox actually worked again.

capitainenemo

I don't know if you can draw any good conclusions from that on a Linux install. It seems just as likely that, after your switch to a 64-bit install, 32-bit libraries it depended on were missing.

Linux distros don't tend to be as obsessive about maintaining full 32-bit support as the Windows world.

A better test would be to fire up a 32 bit VM and see if Firefox 32 bit crashed there...

sfink

If they made it as far as being able to lose something they were working on, then it's less likely to have been a missing library problem. But I don't know what it was; people do successfully run Firefox on 32-bit Linux.

Firefox does have problems with old profiles, though. I could easily see crud building up there. I don't think Firefox is very good about clearing it out (unless you do the refresh profile thing). You could maybe diagnose it with about:memory, if you were still running that configuration.

ars

I tried about:memory but did not find anything useful. See also my reply here: https://news.ycombinator.com/item?id=45173661

(I'm no longer running it; I switched to 64-bit and was VERY happy to no longer have crashes.)

I technically could re-install the 32-bit one and try it, but honestly, I don't really want to!

rst

If the libraries were missing entirely, I'm not sure 32-bit Firefox would even start. But if they were present and nothing was keeping them updated (pretty likely on an otherwise 64-bit system), they'd likely become out of date -- which could certainly explain spurious crashes.

capitainenemo

Fair point, although Firefox also launches subprocesses, and I don't know if those use the same libraries as the main process. And I also don't know if it dynamically loads supporting libs after launch.

ars

No, this is Debian; it keeps the 32-bit libraries just as up to date as the 64-bit ones. It can handle having both at the same time.

ars

It's Debian: it can handle 32-bit and 64-bit applications at the same time, and the package manager makes sure you have all the dependencies.

I didn't change libraries; it was a gradual switch where you convert applications to 64-bit - and I didn't think to do Firefox, but it wasn't missing 32-bit libraries.

It was simply a profile that I'd been continuously using since approximately 2004, and it was probably too large to fit in 32-bit memory anymore, or maybe Firefox itself needed more memory and couldn't map it. (The system had a 64-bit kernel, so it wasn't low on RAM, but 32-bit apps are limited to 2/3/4 GB of address space.)

capitainenemo

Possible, I suppose. You can restrict Firefox's memory usage in the config. Perhaps its dynamic allocation was getting confused by what was available on the 64-bit machine? Still, why would any 32-bit app even try to allocate more than it could actually handle? I dunno. I'm still inclined to think missing (or out-of-date) libs, but it's hard to say without a bit more detail on the crash. Did anything show up in .xsession-errors / stderr? Were you able to launch it in a clean profile? Were the crashes visible in about:crashes for the profile when launched in 64-bit? I suppose it doesn't matter too much at this point...

bgirard

Did you attach the debugger and see what it was crashing on?

From when I used to work on performance and reliability at Mozilla, these types of user-specific crashes were often caused by a faulty system library or anti-virus-like software doing unstable injections/hooks. Any kind of frequent crash was also easier to reproduce and, as a result, fix.

ars

Here is one of the crash reports: https://crash-stats.mozilla.org/report/index/15999dc2-9465-4...

(I happened to have an un-submitted one, which I just submitted; all the other ones I submitted are older than 6 months and have been purged.)

It would crash in random spots, but usually some kind of X, GLX, EGL or similar library.

But I don't think it was GLX, etc., because it also didn't save my profile except very, very rarely, which was actually a much worse problem!!

(This crash is from a long time ago, hence the old Firefox version.)

capitainenemo

Is that proprietary nvidia driver stuff in the stack trace?

anthk

OpenBSD i386 user here, Atom N270. Anyone who says it's useless... Slashem, cdda:bn, mednafen, Bitlbee, catgirl, maxima+gnuplot, ECL with Common Lisp, offpunk, mutt, aria2c, mbsync, nchat, mpv+yt-dlp+streamlink, tut, dillo, mupdf, telescope plus gemini://gemi.dev and gopher://magical.fish ... all work uber fast. And luakit does OK-ish with a single tab.

kstrauser

I think that validates Mozilla’s decision. “See, retro enthusiasts know enough to select alternatives. We’re not stranding them without computers.”

Narishma

Same hardware but I'm using NetBSD instead since Debian dropped 32-bit support.