Consider using Zstandard and/or LZ4 instead of Deflate
50 comments · August 5, 2025 · jasonthorsness
DefineOutside
This has been applied to Minecraft region files in a fork of Paper, which is a type of Minecraft server.
https://github.com/UltraVanilla/paper-zstd/blob/main/patches...
e-topy
Instead of using a new PNG standard, I'd still rather use JPEG XL, just because it has progressive decoding. And you know, while looking like PNG, being as small as WebP, supporting HDR and animations, and having even faster decoding speed.
https://dennisforbes.ca/articles/jpegxl_just_won_the_image_w...
jchw
JPEG XL definitely has advantages over PNG but there is one serious seemingly insurmountable obstacle:
Nothing really supports it. The latest Safari at least supports it without a feature flag or anything, but it doesn't support JPEG XL animations.
To be fair, nothing supports a theoretical PNG with Zstandard compression either. While that would be an obstacle to using PNG with Zstandard for a while, I kinda suspect it wouldn't be that long of a wait because many things that support PNG today also support Zstandard anyways, so it's not a huge leap for them to add Zstandard support to their PNG codecs. Adding JPEG-XL support is a relatively bigger ticket that has struggled to cross the finish line.
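To make the "not a huge leap" part concrete, here's a rough sketch of how an existing PNG chunk parser might bolt on zstd, assuming a hypothetical, non-standard IHDR compression-method value of 1 for zstd (no such value exists in any spec) and the python-zstandard bindings:

```python
# Rough sketch only: PNG's IHDR defines exactly one compression method (0 = deflate).
# Method 1 meaning "zstd" is a made-up, non-standard value for illustration.
import struct
import zlib
import zstandard  # third-party python-zstandard bindings

PNG_SIG = b"\x89PNG\r\n\x1a\n"

def read_method_and_idat(png: bytes):
    """Walk the chunk stream; return (compression method, concatenated IDAT data)."""
    assert png[:8] == PNG_SIG
    pos, method, idat = 8, None, bytearray()
    while pos < len(png):
        length, ctype = struct.unpack(">I4s", png[pos:pos + 8])
        data = png[pos + 8:pos + 8 + length]
        if ctype == b"IHDR":
            method = data[10]            # compression-method byte in IHDR
        elif ctype == b"IDAT":
            idat += data
        pos += 12 + length               # 4 length + 4 type + data + 4 CRC
    return method, bytes(idat)

def decompress_idat(png: bytes) -> bytes:
    """Return the filtered scanline stream (unfiltering still happens afterwards)."""
    method, idat = read_method_and_idat(png)
    if method == 0:                      # the only value PNG actually defines today
        return zlib.decompress(idat)
    if method == 1:                      # hypothetical zstd extension
        return zstandard.ZstdDecompressor().decompress(idat)  # assumes the frame stores its size
    raise ValueError(f"unknown compression method {method}")
```

The chunk walking, CRCs, filtering, and everything else stays exactly as it is today; only the final decompress call changes.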
The thing I'm really surprised about is that you still can't use arithmetic coding with JPEG. I think the original reason is due to patents, but I don't think there have been active patents around that in years now.
Scaevolus
Every new image codec faces this challenge. PNG + Zstandard would look similar. The ones that succeeded have managed it by piggybacking off a video codec, like https://caniuse.com/avif.
jchw
Why would PNG + Zstandard have a harder time than AVIF? In practice, AVIF needs more new code than PNG + Zstandard would.
IshKebab
> I kinda suspect it wouldn't be that long of a wait
Yeah... guess again. It took Chrome 13 years to support animated PNG - the last major change to PNG.
jchw
APNG wasn't part of PNG itself until very recently, so I'd argue it's kind-of neither here nor there.
edoceo
Maybe they were focused on WebP?
bawolff
> The thing I'm really surprised about is that you still can't use arithmetic coding with JPEG.
I was under the impression libjpeg added support in 2009 (in v7). I'd assume most things support it by now.
jchw
Believe it or not, last I checked, many browsers and some other software (file managers, etc.) still couldn't do anything with JPEG files that have arithmetic coding. Apparently, although I haven't tried this myself, Adobe Photoshop also specifically doesn't support it.
greenavocado
That's because people have allowed the accumulation of power and control by Big Tech. Features in and capabilities of end-user operating systems and browsers are gatekept by a handful of people in Big Tech. There is no free market there. Winners are picked by politics, not merit. Switching costs are extreme due to vendor lock-in and carefully engineered friction.
The justification for WebP in Chrome over JPEG-XL was pure hand-waving nonsense, not technical merit. The reality is they would not dare cede any control or influence to the JPEG-XL working group.
Hell, the EU is CONSIDERING mandatory attestation driven by whitelisted, signed phone firmware for certain essential activities. Freedom of choice is an illusion.
kps
> Nothing really supports it.
Everything supports it, except web browsers.
jchw
JPEG-XL is supported by a lot of the most important parts of the ecosystem (image editors and the major desktop operating systems) but it is a long way away from "everything". Browsers are the most major omission, but given their relative importance here it is not a small one. JPEG-XL is dead in the water until that problem can be resolved.
If Firefox is anything to go off of, the most rational explanation seems to be that adding a >100,000-line multi-threaded C++ codebase as a dependency for something that parses untrusted user input in a critical context like a web browser is undesirable at this point in the game (other codecs remain a liability, but at least they have seen extensive battle-testing and fuzzing over the years). I reckon this is probably the main reason why there has been limited adoption so far. Apple seems not to mind too much, but I am guessing they've just put so much into sandboxing WebKit and image codecs already that they are relatively less concerned with whether or not there are memory safety issues in the codec... but that's just a guess.
Zardoz84
You can use a polyfill.
adzm
Web browsers already have code in place for WebP (lossless, VP8) and AVIF (AV1, which also supports animation), as well as classic JPEG and PNG, and maybe also HEIC (HEVC/H.265)... what benefit do we get by adding yet another file format if all the use cases are already covered by the existing formats? That said, I do like JPEG-XL, but I also kind of understand the hesitation to adopt it. I imagine if Apple's push for it continues, then it is just a matter of time before it gets supported more broadly in Chrome etc.
bawolff
Doesn't PNG have progressive decoding? I.e., the Adam7 algorithm.
layer8
It does, using Adam7: https://en.wikipedia.org/wiki/Adam7_algorithm
The recently released PNG 3 also supports HDR and animations: https://www.w3.org/TR/png-3/
bawolff
> The recently released PNG 3 also supports HDR and animations: https://www.w3.org/TR/png-3/
APNG isn't recent so much as the specs were merged together. APNG will be 21 years old in a few weeks.
duskwuff
Adam7 is interlacing, not progressive decoding (i.e. it cannot be used to selectively decode a part of the image). It also interacts extremely poorly with compression; there is no good reason to ever use it.
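For reference, Adam7 just reorders pixels into seven fixed passes over a repeating 8x8 tile; a tiny sketch of the mapping (pure Python, nothing beyond the standard pattern):

```python
# Adam7 pass numbers for each position in the repeating 8x8 tile (from the PNG spec).
ADAM7 = [
    [1, 6, 4, 6, 2, 6, 4, 6],
    [7, 7, 7, 7, 7, 7, 7, 7],
    [5, 6, 5, 6, 5, 6, 5, 6],
    [7, 7, 7, 7, 7, 7, 7, 7],
    [3, 6, 4, 6, 3, 6, 4, 6],
    [7, 7, 7, 7, 7, 7, 7, 7],
    [5, 6, 5, 6, 5, 6, 5, 6],
    [7, 7, 7, 7, 7, 7, 7, 7],
]

def adam7_pass(x: int, y: int) -> int:
    """Which of the seven passes carries pixel (x, y)."""
    return ADAM7[y % 8][x % 8]
```

Each pass is a coarse subsample of the entire image, which is why you get a blurry full-frame preview early but can never decode just a region, and why neighboring samples within a pass are far apart in the original image, which hurts the filters and Deflate.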
arp242
Comparison of "zpng" (PNG wth zstd) and WebP lossless, with current PNG. From https://github.com/WangXuan95/Image-Compression-Benchmark :
Compressed format      Compressed size (bytes)   Compress Time   Decompress Time
WEBP (lossless m5)     1,475,908,700             1,112           49
WEBP (lossless m1)     1,496,478,650             720             37
ZPNG (-19)             1,703,197,687             1,529           20
ZPNG                   1,755,786,378             26              24
PNG (optipng -o5)      1,899,273,578             27,680          26
PNG (optipng -o2)      1,905,215,734             4,395           27
PNG (optimize=True)    1,935,713,540             1,120           29
PNG (optimize=False)   2,003,016,524             335             34
Doesn't really seem worth it? It doesn't compress better, and it's only slightly faster in decompression time.
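If you want to sanity-check the decode-time column yourself, a rough sketch of the measurement (assuming the python-zstandard bindings and a hypothetical `raw` buffer of filtered image bytes; absolute numbers will vary by machine):

```python
import time
import zlib
import zstandard  # third-party python-zstandard bindings

def time_decompress(fn, blob, iterations=1000):
    start = time.perf_counter()
    for _ in range(iterations):
        fn(blob)
    return time.perf_counter() - start

# `raw` is a hypothetical buffer of filtered scanline bytes.
deflate_blob = zlib.compress(raw, 9)
zstd_blob = zstandard.ZstdCompressor(level=19).compress(raw)

print("deflate:", time_decompress(zlib.decompress, deflate_blob))
print("zstd:   ", time_decompress(zstandard.ZstdDecompressor().decompress, zstd_blob))
```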
"Only slightly faster in decompression time."
m5 vs -19 is nearly 2.5x faster to decompress; given that most data is decompressed many, many more times (often thousands or millions of times more, often by devices running on small batteries) than it is compressed, that's an enormous win, not "only slightly faster".
The way in which it might not be worth it is the larger size, which is a real drawback.
arp242
The difference is barely noticeable in real-world cases, in terms of performance or battery. Decoding images is a small part of loading an entire webpage from the internet. And transferring data isn't free either, so any benefits need to be offset against the larger file size and increased network usage.
fmbb
Win how?
More efficiency will inevitably only lead to increased usage of the CPU and, in turn, batteries draining faster (Jevons paradox).
hcs
So someone is going to load 2.5x as many images because it can be decoded 2.5x faster? The paradox isn't a law of physics, it's an interesting observation about markets. (If this was a joke it was too subtle for me)
snickerdoodle12
Might as well just shoot yourself if that's how you look at improvements. The only way to do something good is to stop existing. (this is a general statement, not aimed at you or anyone in particular)
bobmcnamara
Am I reading those numbers right? That's like 25x faster compression than WEBP m1; there's probably a use case for that.
arp242
The numbers seem small enough that it will rarely matter, but I suppose there might be a use case somewhere?
But let's be real here: this is basically just a new image format. With more code to maintain, fresh new exciting zero-days, and all of that. You need a strong use case to justify that, and "already fast encode is now faster" is probably not it.
scott_w
I don't think it's quite as bad, though? It's using a known compression library that (from reading other comments) has seen widespread use and testing. The rest of PNG would remain unchanged, since the compression method is effectively a pluggable component.
I know it needs to be battle tested as a single entity but it’s not the same as writing a new image format from scratch.
zX41ZdbW
Very reasonable.
I've recently experimented with methods of serving bitmaps out of the database in my project[1]. One option was to generate PNG on the fly, but simply outputting an array of pixel color values over HTTP with Content-Encoding: zstd won over PNG.
Combined with 2D delta encoding as in PNG's filters, it would be even better.
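A minimal sketch of that combination, assuming the python-zstandard bindings and hypothetical `pixels`/`width`/`height` inputs; a PNG-style "Up" filter is applied before compression, and the client would undo it after the transparent zstd decode:

```python
import zstandard  # third-party python-zstandard bindings

def up_filter(pixels: bytes, width: int, height: int, channels: int = 3) -> bytes:
    """PNG-style 'Up' filter: each byte minus the byte one row above, mod 256."""
    stride = width * channels
    out = bytearray(pixels[:stride])                 # first row has no row above; keep as-is
    for y in range(1, height):
        row, prev = y * stride, (y - 1) * stride
        out += bytes((pixels[row + i] - pixels[prev + i]) & 0xFF for i in range(stride))
    return bytes(out)

def encode_response(pixels: bytes, width: int, height: int):
    """Return a response body plus headers for the raw-pixels-over-zstd approach."""
    body = zstandard.ZstdCompressor(level=3).compress(up_filter(pixels, width, height))
    headers = {"Content-Encoding": "zstd", "Content-Type": "application/octet-stream"}
    return body, headers
```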
bawolff
I think there is a benefit to knowing that if you have a png file it works everywhere that supports png.
Better to make backwards-compatibility breaks entirely new formats.
hughw
Related: what's the status of content negotiation? Do any browsers use it seriously, and has it been successful? If so, then why not zpng?
willvarfar
We ought to consider using QOI instead.
QOI often achieves equivalent or better compression than PNG, _before_ you even compress its output with something like LZ4.
Compressing QOI with something like LZ4 would generally outperform PNG.
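As a sketch of that pipeline, where `qoi_encode` is a hypothetical stand-in for whatever QOI encoder you use and `lz4.frame` comes from the `lz4` package:

```python
import lz4.frame

# `qoi_encode` and the pixel inputs are placeholders for any QOI encoder.
qoi_bytes = qoi_encode(pixels, width, height, channels=3)
payload = lz4.frame.compress(qoi_bytes)       # fast second-stage compression
assert lz4.frame.decompress(payload) == qoi_bytes
```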
HocusLocus
The reason we have a world full of .gif today is that the .png committee rejected animation back when everyone was saying PNG would be the "GIF killer". Just sayin'. Don't hold your breath.
edoceo
Remember this: https://burnallgifs.org/
privatelypublic
Does deflate lead the pack in any metric at all anymore? The only one I can think of is extremely low-spec compression (microcontrollers).
JoshTriplett
The only metric deflate leads on is widespread support. By any other metric, it has been superseded.
atiedebee
I'd assume memory usage as well, because it has a tiny context window compared to zstd.
JoshTriplett
You can change the context window of zstd if you want. But yes, the default context window size for zstd is 8 MB, versus deflate's 32 KiB.
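For example, with the python-zstandard bindings (parameter spellings as I recall them from those bindings, so worth double-checking), you can cap the window so a constrained decoder never needs more than deflate-like memory:

```python
import zstandard  # third-party python-zstandard bindings

# window_log=15 caps the match window at 2**15 = 32 KiB, roughly deflate territory.
params = zstandard.ZstdCompressionParameters.from_level(3, window_log=15)
cctx = zstandard.ZstdCompressor(compression_params=params)

blob = cctx.compress(data)   # `data` is a hypothetical payload
```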
adgjlsfhk1
Even there, LZ4 is probably better.
hinkley
You think LZ4 is more portable than zlib? I'm gonna need some citations on that.
zlib is 30 years old, according to Wikipedia. And that's technically wrong since 'zlib' was factored out of gzip (nearly 33 years old) for use in libpng, which is also 30 years old.
duskwuff
A basic LZ4 decompressor is on the order of a few dozen lines of code. It's exceptionally easy to implement.
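To put a number on "a few dozen lines", here is a sketch of a complete decoder for the raw LZ4 block format (frame headers and checksums left out):

```python
def lz4_block_decompress(src: bytes) -> bytes:
    """Decode one raw LZ4 block (no frame header, no checksums)."""
    dst = bytearray()
    i = 0
    while i < len(src):
        token = src[i]; i += 1
        # Literal length: high nibble, extended by 255-run bytes.
        lit_len = token >> 4
        if lit_len == 15:
            while True:
                b = src[i]; i += 1
                lit_len += b
                if b != 255:
                    break
        dst += src[i:i + lit_len]
        i += lit_len
        if i >= len(src):               # the final sequence carries literals only
            break
        # Match: 2-byte little-endian offset back into the output, then length.
        offset = src[i] | (src[i + 1] << 8); i += 2
        match_len = token & 0x0F
        if match_len == 15:
            while True:
                b = src[i]; i += 1
                match_len += b
                if b != 255:
                    break
        match_len += 4                  # minimum match length is 4
        start = len(dst) - offset
        for j in range(match_len):      # byte-wise copy so overlapping matches work
            dst.append(dst[start + j])
    return bytes(dst)
```

Not optimized, obviously, but it shows how little state the format needs: a token byte, optional length extensions, a 16-bit back-reference, and a copy.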
adgjlsfhk1
Not more portable, but probably faster in resource-constrained environments.
encom
(2021)
In my opinion PNG doesn't need fixing. Being ancient is a feature. Everything supports it. As much as I appreciate the nerdy exercise, PNG is fine as it is. My only gripe is that some software writes needlessly bloated files (like adding an alpha channel when it's not needed). I wish we didn't need tools like OptiPNG etc.
ori_b
Yes. One of the best features of PNG is that I don't have to wonder if it's going to work somewhere. Throwing that away in favor of a bit of premature optimization seems like a big loss. Especially as this wouldn't be the only modernized image compression format out there. Why use this over, e.g., lossless WebP?
I don't think I have ever noticed the decode time of a png.
heinrich5991
Most of the comments on that issue are from this year.
One of the interesting features of Zstandard is its support for external dictionaries. It supports "training" a dictionary on a set of samples, of whatever size (16 KiB, 64 KiB, etc.), then applying that dictionary as a separate input for compression and decompression. This lets you compress short content much more effectively.
I doubt it would apply to PNG, because the length and content don't seem dictionary-friendly, but it would be interesting to try on some giant collection of scraped PNGs. This approach was important enough for Brotli to include a "built-in" dictionary covering HTML.
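A small sketch of that workflow with the python-zstandard bindings, where `samples` stands in for whatever corpus of short payloads you train on:

```python
import zstandard  # third-party python-zstandard bindings

# `samples` is a hypothetical list of byte strings (e.g. many small files of one kind).
dict_data = zstandard.train_dictionary(16 * 1024, samples)   # 16 KiB dictionary

cctx = zstandard.ZstdCompressor(dict_data=dict_data)
dctx = zstandard.ZstdDecompressor(dict_data=dict_data)

blob = cctx.compress(samples[0])
assert dctx.decompress(blob) == samples[0]   # the dictionary must ship alongside the data
```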