
The sad state of font rendering on Linux (2018)

jchw

To me, one of the most influential pieces of writing about subpixel rendering, and in particular an exploration of the ways Microsoft got it wrong, was by Maxim Shemanarev (R.I.P.), the late developer of Anti-Grain Geometry:

https://agg.sourceforge.net/antigrain.com/research/font_rast...

Though to be fair to this article, Microsoft did improve things with DirectWrite, and yes the situation on Linux is quite bad unfortunately.

As a bonus, here's a pretty great article about gamma correctness in font rendering, an issue that is often overlooked even when it is acknowledged:

https://hikogui.org/2022/10/24/the-trouble-with-anti-aliasin...

Just some additional reading materials if you're interested in this sort of thing.

zekica

Subpixel rendering works completely fine on Linux. I'm using it right now, using "full" hinting and "RGB" subpixel rendering. It even works completely fine with "non-integer" scaling in KDE, even in firefox when "widget.wayland.fractional-scale.enabled" is enabled.
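For anyone wanting to reproduce a setup like this outside KDE's settings dialog, the knobs live in fontconfig. A minimal per-user sketch (the property names are fontconfig's; the specific values are my guess at a "full hinting + RGB subpixel" configuration like the one described above):

```shell
# Write a per-user fontconfig enabling RGB subpixel rendering with full hinting.
mkdir -p ~/.config/fontconfig
cat > ~/.config/fontconfig/fonts.conf <<'EOF'
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting"   mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintfull</const></edit>
    <edit name="rgba"      mode="assign"><const>rgb</const></edit>
    <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
  </match>
</fontconfig>
EOF
```

Applications pick this up on restart; `fc-match -v <family>` can confirm which values fontconfig resolves for a given font.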

vouaobrasil

On the other hand, subpixel rendering is absent from MacOS, which makes it very difficult to use regular ol' 1920x1080 screens with modern MacOS. Yes, those Retina displays look nice, but it's a shame that lower-res screens do not, because they work perfectly fine except for the font rendering.

nomel

My first (and last) 1920x1080 monitor was a 50 lb CRT I picked up on the side of the road in 2003.

I haven't owned a smartphone with a screen resolution that low in over 10 years.

I think it's an amazing feat of marketing, by display companies, that people still put up with such low resolutions.

lttlrck

It's still a perfectly serviceable resolution.

Of course 16:9 pushed down display costs, leading to the demise of 1920x1200, which is unforgivable ;-)

Those 120 pixels were sorely missed.

sam_lowry_

Hm... I am reading this on the 1600x900 screen of my T420s Frankenpad while sitting at dusk in a German campsite. I ordered the screen some 10 years ago off Alibaba or something, and it is exactly the resolution and brightness I need. I hope I will die before this Frankenpad does, because contemporary laptops are awful in so many aspects.

You know... as you age, you really can't read all those tiny characters anyway.

vouaobrasil

That's true that they aren't interested. But I still like such screens. I used one quite recently, and it worked just fine for my needs.


perching_aix

A Full HD CRT from the roadside in 2003? As if this was just a thing people had happen to them? Is this some elaborate joke I'm missing?

> I haven't owned a smartphone with a screen resolution that low

Smartphone in italics, because smartphones are known for their low pixel densities, right? What?

Did you own a smartphone at all in the past 10 years? Just double checking.

> I think it's an amazing feat of marketing, by display companies, that people still put up with such low resolutions.

And how did you reach that conclusion? Did you somehow miss display companies selling and pushing 1440p and 4K monitors left and right for more than a handful of years at this point, and yet the Steam Hardware Survey still bringing out 1080p monitors as the king month to month?

Sometimes I really do wonder if I live in a different world to others.

msgodel

It works on Xterm for me, I didn't enable anything special except using a vector font.

That's something that OSX doesn't even have now.

neoden

The article is from 2018 and that should be mentioned in the title

mushufasa

I would love to see an update on what has improved and what is the same

tvshtr

On hi-res screens most of it is irrelevant

jeffbee

Yeah even the flag they are talking about doesn't exist in Chrome anymore. Skia is the only text rendering I ever suffer under Linux, so whether or not Skia works properly is the only thing that makes a difference to me.

kccqzy

Yes definitely. I stopped reading after the OS X section because it was clearly talking about a different era.

scblock

Windows has the worst font rendering of all modern operating systems. Wanting anything like Windows font rendering is insane. Windows 10 makes it near impossible to properly turn off subpixel hinting without also turning off all anti-aliasing, which on a QD-OLED screen makes for horrific color fringing. Windows 11 is better, but still pretty weak. Linux is roughly as good as Mac OS, both of which are miles better than Windows.

Mac OS dropped the subpixel garbage (it really is garbage if you're at all sensitive to fringing or use anything other than a standard LCD) in favor of high pixel density screens. Sharp, readable text and zero color fringing. This is the way.

namibj

> (it really is garbage if you're at all sensitive to fringing or use anything other than a standard LCD)

Human eyes have higher spatial resolution for brightness than for color. At the cost of software complexity and mild computational overhead, a screen with a Bayer matrix (or a technology-appropriate similar subpixel layout), paired with software that anti-aliases content by clamping brightness and color resolution separately — to values the screen can actually reproduce, and close enough together not to disturb the viewer — will look better than lazily clamping brightness and color resolution to the same value, as Apple did. (Yes, that phrasing uses dashes; the point stands: separate clamps beat a single shared clamp.)

If you have a non-"default" screen subpixel layout then you need to remain able to drive each subpixel individually from the computer and to have the antialiasing algorithm be aware of the specific arrangement you have.

And no, until you can point me to a sub-$2000 (and at that price and that poor contrast, a minimum of 120 Hz) 35~55" screen with at least 2500:1 static contrast, a vaguely 16:9 aspect ratio (though I'll accept 4:3 with the same pixel count and density and accordingly scaled dimensions), and at least 10k individually addressed (and anti-aliased onto by the font rendering) horizontal pixels, I'll happily stay with my 11520 horizontal (sub-)pixels that I paid about $700 for (43", 5000:1 static contrast, 60 Hz).

Night_Thastus

With OLEDs with funky pixel layouts starting to become more popular, I hope Windows starts making their system less crap...

not_a_bot_4sho

I suppose this is a subjective area. I would rank Windows on top, Mac as a close second, and Linux ... well, I love Linux for reasons other than UI.

adrian_b

While the font rendering method matters, the differences between operating systems are typically much smaller than the differences in quality between typefaces.

Linux had a very bad appearance in the past with its default configuration, and it still does not look good in most distributions, but that is not due to bad rendering algorithms; it is due to the fact that the default free typefaces are usually not very good.

For several decades, the first thing that I have always done after installing Linux was to delete all default typefaces and replace them with some high-quality typefaces, most of which I have bought, with a couple taken from a Mac OS and a Windows that I had bought in the past (which I have stopped using many years ago, except for the few typefaces that I have kept from them).

Because of this policy, any text on my Linux computers has always looked much better than on any of the Windows or Mac OS computers that I have used at work.

type0

it's not subjective if you use OLED screen

eddythompson80

I have an OLED desktop monitor and have the same preference order as OP

exe34

Yeah I don't understand this difference of opinion here - Linux looks fine to me, Mac looks pretty and Windows looks like it's been driven over a few times.


dartharva

Heavily disagree as a longtime Linux user. I don't know about MacOS but Windows has always had better font rendering than Linux in my experience.

TLLtchvL8KZ

I feel exactly the same; the font rendering on Linux drives me absolutely insane. The number of hours I've spent over the past 15 years tweaking fontconfig, having to compile specially patched packages, etc., doesn't bear thinking about.

I don't even have to edit anything on Windows now, and when I did a few years ago it was only on a clean install, going through what I think was called the ClearType configuration: you had a panel of six images/samples to choose from, and after going through it all, everything looked pretty damn perfect.

b3orn

I'm currently using a Mac for private stuff, used to use Linux for work stuff, and am currently forced to use Windows. My font rendering ranking is MacOS > Linux > Windows.

tgv

I rate it macOS > Windows > Linux. iOS is pretty good too, mobile Windows wasn't. But I'm only experiencing Linux graphically on a rather old monitor or through a terminal emulator, and macOS and Windows on a nice monitor, so that probably skews my perception. I wonder how many people observe the three OS'es through the same (or very similar) monitors.

forrestthewoods

> in favor of high pixel density screens

I wish I could download an OS update that gave me a high pixel screen! But, uhhh, that’s not how it works.

carlosjobim

Do yourself a favour and go out and purchase a high pixel density screen. I don't understand people who spend a good part of their days in front of a computer (or employ others to do it) but refuse to get quality hardware. Is it worth destroying your eyes, posture, and joints to save this money? Not to speak of just the personal pleasure of using good hardware.

Restaurants spend tens or hundreds of thousands of dollars on equipment so that their staff can work more efficiently. Why don't IT people take inspiration from them and get a bit better equipment? It's not a luxury item.

perching_aix

> Is it worth destroying your eyes, posture, and joints to save this money?

Not high enough pixel density causes which one of these again?

> Why doesn't IT people take inspiration from them and get a bit better equipment.

Are we talking about the same field where people are having a competition about who can build the most overpriced and obnoxious sounding keyboards imaginable?

forrestthewoods

I’m currently running a 4K 32” monitor. What exact monitor do you recommend I buy for my Windows desktop?

perching_aix

[flagged]

lp0_on_fire

> The only way for me not to accuse you of just straight up lying is to just wrap my thoughts into the cushion of subjectivity.

This was unnecessary.

perching_aix

I really don't mean it as an insult, it legitimately just comes off like people are lying. The difference in experience is really just that stark that I honestly cannot fathom these claims actually being truthful, certainly not in the absolute sense they are presented. So, subjectivity remains.

omnimus

One important detail is that fonts themselves carry their own hinting tables, so authors can decide how a font will be rendered on low-DPI screens. This is tedious and expensive. And, you guessed it, many libre fonts simply don't do it right or don't have the capacity to do it at all.

That's why there can be quite a big quality jump when you compare them to fonts from the big design teams at Apple or Microsoft. Not only might the font itself be a bit worse; the rendering/hinting is often way worse.

layer8

Exactly. This is why whenever the release of a new font is being posted here (usually coding fonts), I end up being not interested due to the lack of pixel-level hinting.

necovek

One thing that used to be possible with Freetype was configuring how "heavy" hinting was: I remember the time when autohinted fonts looked the best with "light" hinting. They were smooth, non-bold and I couldn't see colour fringing either.

You could also set the RGBA vs whatever pixel layout in the same Gnome settings dialog. Easy-peasy adjustment for a VA panel.

After, it was available only in gconf/dconf or a tool like gnome-tweaks or similar.
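For reference, on current GNOME the equivalent settings can be flipped from the command line; the key names below exist on recent GNOME releases (roughly 3.32 onward — an assumption about your desktop version, so check `gsettings list-keys org.gnome.desktop.interface` first):

```shell
# Hinting strength: 'none', 'slight', 'medium', or 'full'.
gsettings set org.gnome.desktop.interface font-hinting 'slight'

# Anti-aliasing mode: 'none', 'grayscale', or 'rgba' (subpixel).
gsettings set org.gnome.desktop.interface font-antialiasing 'rgba'
```

These are desktop-session configuration commands, so they only take effect inside a running GNOME session.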

MacOS is definitely terrible today, but I prefer Linux over Windows still.

ranger207

I think Linux font rendering looks fine (although it has noticeably gotten better since this post was last updated in 2019), but I absolutely agree that MacOS has the worst-looking font rendering. And I was using it on a genuine MacBook Pro! Discussions otherwise have convinced me that apparently font rendering just isn't objective but is opinion-based.

bradfitz

I haven't used displays with under ~215ppi in over 10 years. I find these subpixel opinion discussions still ongoing very... quaint. :)

sunnyps

So you haven't used a 32 inch 4K monitor, which is ~135 ppi? What do you get at that size, a 5K or 6K monitor? Not many of those are available, and they have specific requirements like higher DisplayPort or Thunderbolt bandwidth.

There's also an entire world of users still on 720p and 1080p displays. They deserve better font rendering even if it doesn't affect us personally.

bradfitz

I haven't. I have 5K and 6K monitors. That's indeed privileged, but only for a bit, until such displays are commonplace and cheap. So this all sounds like a very temporary problem at this point for something so subjective. Every time this topic comes up it's the same "but I like the look of $OS rendering the best" comments.

6yyyyyy

High PPI screens have been around for 10 years or so, and they still cost about twice as much as a standard PPI screen the same size.

Put yourself in the shoes of the average computer purchaser: Would you rather buy a high PPI monitor, or two standard PPI monitors? To me this is a no-brainer.

vouaobrasil

Maybe you can afford such a display. But I still like regular HD displays because they are cheap and functional.

RGBCube

Almost MacOS-tier font rendering, for free:

    FREETYPE_PROPERTIES="cff:no-stem-darkening=0 autofitter:no-stem-darkening=0"
Probably only good in high DPI monitors though.

omnimus

Looking at the comments, it seems that this is very subjective. People seem to prefer whatever they are used to the most.

Fits with me - as a long-time Mac user I like Mac rendering, and Linux feels very similar, so I like it too. Windows feels like somebody is burning the fonts into the LCD. It is probably more legible at tiny sizes on low-pixel-density screens, but it is too strong and not very elegant everywhere else.

ndiddy

> The traditional way of achieving this is through installing ttf-mscorefonts-installer or msttcorefonts. The msttcorefonts package looks like some Shenzhen basement knockoff that renders poorly and doesn’t support Unicode. I suspect that these fonts have gone through multiple iterations in every Windows release and that the versions available through the repositories must be from around Windows 95 days.

This is because these font files originate from a Microsoft initiative called "Core Fonts for the web" that ran between 1996 and 2002. Before web fonts became a thing, Microsoft wanted to make a set of broadly available fonts that web designers could assume everyone had on their computers. Because Microsoft cancelled the initiative, the redistributable versions of those fonts are stuck in time. They were last updated around 2000, and any updated versions with further improvements or added characters aren't freely redistributable.

Sunspark

This is OK for me because I use these with full (or medium) hinting and anti-aliasing off in some apps, and greyscale anti-aliasing in other apps with the v35 interpreter.

v40 with slight hinting and greyscale or subpixel works, but I don't tend to use the fonts that are meant to be used with slight hinting, and later fonts can't handle anti-aliasing off at all.

You can easily do this on a per-app basis with Flatpaks: you can set an environment variable with Flatseal, and you can drop a fontconfig folder with a custom fonts.conf into the Flatpak's var directory.
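A sketch of the same thing from the CLI instead of Flatseal (`org.mozilla.firefox` is just a stand-in app ID, and `~/my-fonts.conf` a hypothetical config you've prepared; the FreeType property shown selects the v35 interpreter mentioned above):

```shell
# Set FREETYPE_PROPERTIES for one Flatpak app only.
flatpak override --user \
  --env=FREETYPE_PROPERTIES="truetype:interpreter-version=35" \
  org.mozilla.firefox

# Give that same app its own fontconfig, invisible to every other app.
mkdir -p ~/.var/app/org.mozilla.firefox/config/fontconfig
cp ~/my-fonts.conf ~/.var/app/org.mozilla.firefox/config/fontconfig/fonts.conf
```

Because the override and the fonts.conf both live under the app's own sandbox state, removing them (`flatpak override --user --reset org.mozilla.firefox`) restores the system-wide behavior.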

bee_rider

What’s wrong with the v35 freetype picture? He writes like it is immediately obvious, but it seems fine.

jchw

See the jump from 17pt to 18pt? That's wrong. (Also, the small sizes are just completely obliterated IMO.) Font outlines are scalable; they should have the same relative weight no matter what pt/px size you render them at, and they should have the same proportions. Non-scalable rendering is incorrect (although techniques like hinting and gridfitting do intentionally sacrifice scalability for better legibility, but I argue you can do better in most cases.)

zajio1am

Rendering vector fonts to a fixed grid of pixels leads to incorrect results in principle. Introducing blur where there is a sharp edge in the vector data is also a wrong result. You can only choose which kind of wrong is more annoying - distortion due to grid-fitting, or blur due to naive rendering and anti-aliasing.

jchw

There is no objectively "best" way to render vector typefaces to a raster, but that's not because all of the options are equally correct; it's because options that are more accurate to a font might look subjectively worse. It's not "incorrect" that a raster rendering of a shape can't convey the signal with perfect fidelity, but that doesn't mean that all renderings of vectors to raster are equally correct.

Like fine, let's put aside somewhat intentional things like hinting and grid-fitting with accumulating error for a minute. Some FreeType configurations dramatically fuck up the visual weight of fonts, making the regular style in a type face look fairly bold. The damn font looks wrong. It's not "wrong" as in I disagree with what the designers intended for the type face, it's wrong as in it looks nothing like the designers intended and it looks nothing like the parameters you put in to render the font. There is basically no perspective where this output is desired, it's just a bad rendering.

There's definitely a bit of subjectivity in exactly where to draw the line, but there is definitely still a line you can cross that just goes into blatantly wrong territory. The relative visual weight of a glyph is not supposed to be influenced by its size on screen.

bee_rider

Who cares? That only matters if you have a bizarre document that is incrementing through all the font sizes.

jchw

Well, because it literally distorts the glyphs and thus doesn't actually look right, it would be like if some of the pixels on your screen were inexplicably the wrong color due to a color management issue. In some cases the distortion is really bad and doesn't even really improve legibility at all, so it's just a plain lose/lose. If you don't give a shit about typography in the least and don't care about the visual weight of text then fine, but not caring doesn't mean the behavior is correct by any means. (And keep in mind, you will often have more than one font size of text on screen at once, so this distortion will change the relative weight of fonts incorrectly, aside from also distorting the actual shape of glyphs.)

But OK, other than just being incorrect, does it matter? Many people don't have proper color management in their software and it's usually fine. Well, yes, sometimes it matters. For one thing, this issue really screwed up scaling in Win32 and even GTK+2, because if you tried to render dialogs with different font sizes it would completely change the UI and screw up some of the component sizing. OK, though, you can fix that by just not using a fixed layout. However, you still run into this problem if you want to render something that actually does have a specific layout. The most obvious example of how this can be a serious problem is something like Microsoft Word that is meant to give you a WYSIWYG view of a document on paper, but the paper is 300+ DPI and the poor screen is only 96 DPI.

Maybe most importantly, this is all pointless! We don't actually have to settle for these concessions for Latin script text on 96 DPI screens. Seriously, we really don't. I recommend this (old) article for a dive into the problems caused by non-scalable font rendering and how it could've probably been solved all along:

https://agg.sourceforge.net/antigrain.com/research/font_rast...

(Though to be fair, there are still problems with the approach of vertical-only hinting, as it does cause distortion too.)