
Memory safety for web fonts


229 comments · March 19, 2025

declan_roberts

> Merely keeping up with the stream of issues found by fuzzing costs Google at least 0.25 full time software engineers

I like this way of measuring extra work. Is this standard at Google?

pradn

Yes, a SWE-year is a common unit of cost.

And there are internal calculators that tell you how much CPU, memory, network etc a SWE-year gets you. Same for other internal units, like the cost of a particular DB.

This allows you to make time/resource tradeoffs. Spending half an engineer’s year to save 0.5 SWE-y of CPU is not a great ROI. But if you get 10 SWE out of it, it’s probably a great idea.

I personally have used it to argue that we shouldn’t spend 2 weeks of engineering time to save a TB of DB disk space. The cost of the disk comes to less than a SWE-hour per year!
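To make that arithmetic concrete, here is a toy sketch of the trade-off calculation (all dollar figures and rates are made-up placeholders, not Google's internal numbers):

```rust
// Sketch of the SWE-year trade-off arithmetic described above.
// The cost-per-SWE-year figure is a hypothetical placeholder.

/// Cost of a project, expressed in SWE-years.
fn project_cost_swe_years(engineers: f64, years: f64) -> f64 {
    engineers * years
}

/// Convert an annual resource saving (in dollars) into SWE-years per year,
/// given a fully-loaded cost per SWE-year.
fn saving_in_swe_years(annual_saving_usd: f64, cost_per_swe_year_usd: f64) -> f64 {
    annual_saving_usd / cost_per_swe_year_usd
}

fn main() {
    // Hypothetical: one engineer spends half a year; the change saves
    // $50k/yr of CPU against a $400k fully-loaded SWE-year.
    let cost = project_cost_swe_years(1.0, 0.5);           // 0.5 SWE-y
    let saving = saving_in_swe_years(50_000.0, 400_000.0); // 0.125 SWE-y/yr
    // Payback period: how many years until the saving covers the cost.
    let payback_years = cost / saving;
    println!("cost = {cost} SWE-y, saving = {saving} SWE-y/yr, payback = {payback_years} yr");
}
```

With these made-up numbers the payback period is four years, which is the kind of result that kills a proposal in this framework.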

jorvi

Note that this can lead to horrid economics for the user.

An example being Google unilaterally flipping on VP8/VP9 decode, which at that time purely decoded on the CPU or experimentally on the GPU.

It saved Google a few CPU cycles and some bandwidth but it nuked every user's CPU and battery. And the amount of energy YouTube consumed wholesale (so servers + clients) skyrocketed. Tragedy of the Commons.

pradn

There's definitely some nuance around how many resources to consume at the client vs the server. Video decoding and ML inference are probably at the extreme end of what you can make a client do.

On the whole, since clients are so constrained, it usually pays to be efficient there - make websites load quickly to increase revenue, require only weak hardware to get more game/app sales, etc. Clients are also untrusted, so there's so many things you can only do on the backend.

izacus

How is that related to the tradeoff calculation at hand?

Implementing VP9, enabling it, transcoding videos and testing it COST SWE hours, it didn't save them. It also cost resources.

In what way did you envision the VP9 issue being related to SWE/resource hour computations here?

crazysim

It also saved licensing costs to MPEG-LA too I guess.

okeuro49

Do you have an article about this? What is the state now?

nairb774

FTE. Full-time equivalent. Most costs are denominated in FTE - headcount as well as things like CPU/memory/storage/...

The main economic unit for most engineers is FTE not $.

sanbor

I’m pretty sure FTE stands for full-time employee

jdlshore

When used to compare costs, “equivalent” is used instead of “employee.”

“The cost of one full-time employee is equivalent to X CPUs.” —> “The full-time equivalent is X CPUs.”

h3half

This is also extremely extremely common in engineering services contracts, both for government and private sector contracting. RFPs (or their equivalent) will specifically lay out what level of effort is expected and will often denote effort in terms of FTE

fph

Technically a measure of power, work/time.

summerlight

Yeah, kind of. It's preferred because whenever you propose or evaluate a change, you can get a rough idea of whether it was worth the time. Like, you work on some significant optimization, then measure it and justify it: the saving was 10 SWE-years where you put in just 1 SWE-quarter.

adam_gyroscope

Yep, often things are measured in FTE or FTE-equivalent units. It’s not precise of course but is a reasonable shorthand for the amount of work required.

kiicia

Yes, all those well paid C-level managers cannot handle multiple units so they require everyone to use one “easy to understand unit so that everything is easy to compare and micromanage”

Someone

If you want to decide which of several options is better, how do you propose doing that without using a single number? You can't, in general, compare multi-dimensional quantities.

kiicia

I never said that the idea of using a single number is bad in itself. What is bad is forcing this single number and its calculation on everyone else like it is some biblical revelation, instead of calculating it ad hoc (and modifying the calculation as needs require). I can only hope the C-suite can add and multiply.

As for whether you can compare multidimensional quantities - of course you can, and it is done every day with engineering and medical data. It's just a tad more complicated than adding and multiplying, so it's a no-go for the C-suite.

you cannot expect people earning six figures to understand actual math, can you?

MForster

It's quite the opposite. This is intended for engineers to make good trade-off decisions as a rule of thumb without financial micromanaging.

whazor

I think this means the engineer fuzzes 4 projects?

colejohnson66

It means it costs them three months per year per employee. So, with 'n' employees, n/4 man-years are spent fixing issues found by fuzzing. As others have said, FTE (full-time equivalent) is the more common name.

maxdamantus

I hope that if we switch away from FreeType, we'll still have a way to use TTF hinting instructions fully.

Windows/macOS don't seem to have a way to enable proper hinting anymore [0], and even FreeType (since 2.7 [1]) defaults to improper hinting (they call it "subpixel hinting", which doesn't make sense to me in theory, and practically still seems blurry, as if it's unhinted).

In case anyone's wondering what properly hinted text looks like, here's a screenshot [2]. This currently relies on setting the environment variable `FREETYPE_PROPERTIES=truetype:interpreter-version=35`, possibly some other configuration through fontconfig, and using a font with good hinting instructions (eg, DejaVu Sans and DejaVu Sans Mono in the screenshot).

My suspicion is that Windows moved away from font hinting after XP because it's very hard to create fonts with good sets of hinting instructions (aiui, OTF effectively only allows something like "autohinting"), so in the modern world of designer fonts it's easier to just have everyone look at blurry text. Some other minor reasons would be that UI scaling in Windows sometimes (or always?) introduces blurring anyway, and viewing raster images on screens of varying resolutions also introduces scaling blur.

[0] Windows still has a way to enable it, but it disables antialiasing at the same time, via an option called "Smooth edges of screen fonts" in "Performance Options". This basically makes it look like XP, which imo is an improvement, but not as good as FreeType, which can do hinting and antialiasing at the same time.

[1] https://freetype.org/freetype2/docs/hinting/subpixel-hinting...

[2] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

AndriyKunitsyn

>In case anyone's wondering what properly hinted text looks like, here's a screenshot

I'm not an expert, but - I'm sorry, it's not.

The point of hinting is to change the shape of the glyphs so the rasterized result looks "better". What "better" is, of course, purely subjective, but most people would agree that it's better when perceived thicknesses of strokes and gaps are uniform, and the text is less blurry, so the eye can discern the shapes faster. I don't think that your rendering scores high points in that regard.

I'll take a phrase from your rendering: "it usually pays" [0]. I don't like it, I'm sorry. The hinter can't make up its mind if the stroke width should be two pixels wide, or three pixels with faint pixels on the sides and an intense one in the center - therefore, the perceived thicknesses vary, which increases eye strain; "l"s are noticeably different; "ys" at the end clumped together into one blurry thing; and there's a completely unnecessary line of faint pixels on the bottom of the characters, which hinting should have prevented.

The second line is how it looks on Windows on 150% scale. Verdana is a different font, so it's an unfair comparison (Verdana was designed specifically to look good on low resolutions), and the rainbows can be off-putting, but I still think the hinter tucks the shapes into pixels better.

Maybe I don't understand something, or maybe there's a mistake.

[0] https://postimg.cc/cKCQR60F

maxdamantus

I'm not entirely sure how you got that first line, but if it's derived from my image, your system must be scaling the image, which introduces blur. Since you mentioned a 150% scale, I'm guessing your image viewer is rendering each pixel in my image as 1.5 pixels (on average) on your screen, which will explain the scaling/blurring, making it difficult to demonstrate proper hinting on your screen with raster images (I alluded to this in my previous post).

Here's an updated version of your image, with the actual pixel data from my image copied in, at 8x and 1x scale [0]. It should be possible to see the pixels yourself if you load it into a tool like GIMP, which preserves pixels as you zoom in.

It should be fairly clear from the image above that the hinting causes the outlines (particularly, horizontal and vertical lines) to align to the pixel grid, which, as you say, both makes the line widths uniform and makes the text less blurry (by reducing the need for antialiasing; SSAA is basically the gold standard for antialiasing, which involves rendering at a higher resolution and then downscaling, meaning a single physical pixel corresponds to an average of multiple pixels from the original image).
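A toy sketch of that grid-alignment effect (the one-dimensional coverage model and numbers are illustrative, not how FreeType's rasterizer is actually implemented):

```rust
// Toy illustration of why grid-fitting reduces blur: a 1px-wide vertical
// stem whose edges land on pixel boundaries covers one pixel fully, while
// an unaligned stem smears partial coverage over two pixels, which
// antialiasing renders as two grey columns. Snapping the stem edge to the
// grid is a simplification of what TrueType hinting instructions achieve.

/// Coverage of pixel `px` by a vertical stem spanning [left, left + width).
fn pixel_coverage(px: u32, left: f32, width: f32) -> f32 {
    let (p0, p1) = (px as f32, px as f32 + 1.0);
    (p1.min(left + width) - p0.max(left)).clamp(0.0, 1.0)
}

fn main() {
    // Unhinted: a 1px stem at x = 2.5 smears over two half-grey pixels.
    assert_eq!(pixel_coverage(2, 2.5, 1.0), 0.5);
    assert_eq!(pixel_coverage(3, 2.5, 1.0), 0.5);

    // Hinted: the stem edge is snapped to the pixel grid first...
    let snapped = 2.5_f32.round(); // 3.0, roughly what a hint does
    // ...so one pixel is fully inked and its neighbour stays clean.
    assert_eq!(pixel_coverage(3, snapped, 1.0), 1.0);
    assert_eq!(pixel_coverage(2, snapped, 1.0), 0.0);
    println!("grid-fit demo passed");
}
```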

Out of interest, I've done a bit of processing [1] to your image to see what ClearType is actually doing [2], and as described in the FreeType post I linked, it seems like it is indeed using vertical hints (so the horizontal lines don't have the colours next to them—this is obvious from your picture), and it seems like it is indeed ignoring any horizontal hints, since the grey levels around the vertical lines are inconsistent, and the image still looks horizontally blurry.

I might as well also link to this demo I started putting together a few years ago [3]. It uses FreeType to render with hinting within the browser, and it should handle different DPIs as long as `devicePixelRatio` is correct. I think this should work on desktop browsers as long as they're scaling aware (I think this is the case with UI scaling on Windows but not on macOS). Mobile browsers tend to give nonsense values here since they don't want to change it as the user zooms in/out. Since it's using FreeType alone without something like HarfBuzz, maybe some of the positioning is not optimal.

[0] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

[1] After jiggling the image around and converting it back from 8x to 1x, I used this command to show each RGB subpixel in greyscale (assuming the typical R-G-B pixel layout used on LCD computer monitors):

    width=137; height=53; stream /tmp/image_1x.png - | xxd -p | sed 's/../&&&/g' | xxd -r -p | convert -size $((width*3))x$((height)) -depth 8 rgb:- -interpolate nearest-neighbor -interpolative-resize $((100*8))%x$((300*8))% /tmp/image_8x_rgb.png
[2] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

[3] https://maxdamantus.eu.org/ftv35-220719/site/

AndriyKunitsyn

>I'm guessing your image viewer is rendering each pixel in my image as 1.5 pixels

Gosh, you are right, I'm so sorry, it really was my PC. Yes, it really is much nicer once I looked at the real pixels of your screenshot. Thank you for the answer.

raggi

I really want hinting, subpixel rendering, and anti-aliasing available on all systems, and I want to pick the appropriate set of techniques based on the DPI and font size to minimize error and fuzzy excess and to balance computation cost. Obviously we still don't all have high DPI all the time and likely won't for a long while. Apple dropping support was a disaster, and I currently run Apple devices plugged into regular-DPI screens at a slightly downsized resolution to trick it into rendering in a sane way, but it's a bonkers process that's also expensive and unnecessary for non-font paint work.

That said, one of the horrors of using the old font stacks, and in many ways the very thing that drove me away from fighting Linux desktop configurations for what is now about 10 years of freedom from pain, was dealing with other related disasters. First it's a fight to even get things consistent, and as seen in your screenshot there's inconsistency between the content render, the title, and the address bar. Worse, though, the kerning in `Yes` in your screenshot is just too bad for constant use for me.

I hope as we usher in a new generation of font tech, that we can finally reach consistency. I really want to see these engines used on Windows and macOS as well, where they're currently only used as fall-back, because I want them to present extremely high quality results on all platforms, and then I can use them in desktop app work and stop having constant fights with these subsystems. I'm fed up with it after many decades, I just want to be able to recommend a solution that works and have those slowly become the consistently correct output for everyone everywhere.

emidoots

If you want consistency, you only need to convince everyone to switch to a single font renderer (e.g. freetype). That won't happen, though, because OS font renderers have quirks that cause them to render the same things in subtly different ways, and users have come to unintentionally expect those quirks. Even if rendering is 'better' in one app.. if it doesn't match the others or what the user is used to.. then it won't 'feel native'.

Maybe if what FreeType is pushing for (fonts as WASM binaries) continues to take hold and encompasses more of fonts, we'll find more consistency over time, though.

drott

Yes, Skrifa executes TrueType hints and has a new autohinting implementation written in Rust. We use these modes in Chrome.

maxdamantus

Hmm.. I tried using the "tools/viewer/viewer --slide GM_typeface_fontations_roboto" example in the skia repository earlier (swapping out the Roboto font for DejaVuSans, since the Roboto font doesn't seem to properly hint horizontally [0]), but the result [1] seems to only be hinted vertically, so similar to the FreeType "v40" interpreter, which supposedly ignores horizontal hints.

Admittedly I haven't looked into how the setup is configured, and haven't tried it in Chrome, so maybe it's still possible to enable full hinting as intended by the font somehow.

[0] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

[1] https://gist.githubusercontent.com/Maxdamantus/3a58d8e764b29...

drott

In the viewer, press '/' and configure hinting to full, or modify the source in gm/fontations.cpp and call SkFont::setHinting(SkFontHinting::kFull) or kNormal.

nyanpasu64

> In addition, before the integration into Chromium, we ran a wide set of pixel comparisons in Skia, comparing FreeType rendering to Skrifa and Skia rendering to ensure the pixel differences are absolutely minimal, in all required rendering modes (across different antialiasing and hinting modes).

I'm hoping (but not sure) Skrifa will support hinting (though I'm not sure how it interacts with fontconfig). I noticed your screenshot uses full hinting (a subjective choice I currently don't use on my machines), presumably with integer glyph positioning/advance which isn't scale-independent (neither is Windows GDI), though this is quite a nonstandard configuration IMO.

Keyb0ardWarri0r

This is the true power of Rust that many are missing (like Microsoft with its TypeScript rewrite in Go): a gradual migration towards safety and the capability of being embedded in existing projects.

You don't have to do the Big Rewrite™, you can simply migrate components one by one instead.

TheCoreh

> like Microsoft with its TypeScript rewrite in Go

My understanding is that Microsoft chose Go precisely to avoid having to do a full rewrite. Of all the “modern” native/AoT compiled languages (Rust, Swift, Go, Zig) Go has the most straightforward 1:1 mapping in semantics with the original TypeScript/JavaScript, so that a tool-assisted translation of the whole codebase is feasible with bug-for-bug compatibility, and minimal support/utility code.

It would be of course _possible_ to port/translate it to any language (Including Rust) but you would essentially end up implementing a small JavaScript runtime and GC, with none or very little of the safety guarantees provided by Rust. (Rust's ownership model generally favors drastically different architectures.)

jeppester

As I understood their arguments it was not about the effort needed to rewrite the project.

It was about being able to have two codebases (old and new) that are so structurally similar, that it won't be a big deal to keep updating both

IshKebab

No, it was absolutely about the effort needed to rewrite the project. They couldn't afford a rewrite, only a port. They're not going to keep maintaining the Typescript version once they have transitioned to the Go version.

mdriley

Hi, I lead Chrome's Rust efforts. I think the Typescript folks made a great and well-reasoned decision.

aapoalas

Thank you, it's really nice seeing cooler heads prevail on the question of "why didn't they build in my favourite thing X?"

In entirely unrelated news, I think Chrome should totally switch engines from V8 to a Rust built one, like hmm... our / my Nova JavaScript engine! j/k

Great stuff on the font front, thank you for the articles, Rust/C++ interop work, and keep it up!

K0nserv

In a similar way Rust can be very useful for the hot path in programs written in Python, Ruby, etc. You don't have to throw out and rewrite everything, but because Rust can look like C you can use it easily anywhere C FFI is supported.
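A minimal sketch of that pattern, with a hypothetical `sum_f64` hot-path function (in a real setup this would be compiled as a cdylib and loaded from Python via ctypes or cffi; here the function is just called from Rust to demonstrate the shape):

```rust
// A Rust function exported with the C ABI, callable from anything that
// speaks C FFI (Python via ctypes/cffi, Ruby via FFI gems, etc.).
// The function name and signature are illustrative.

/// Sum a buffer of f64s, the kind of hot loop you might move out of Python.
///
/// Safety: `ptr` must point to `len` valid, initialized f64 values.
#[no_mangle]
pub unsafe extern "C" fn sum_f64(ptr: *const f64, len: usize) -> f64 {
    std::slice::from_raw_parts(ptr, len).iter().sum()
}

fn main() {
    // Calling it from Rust just to demonstrate; a Python caller would pass
    // a pointer into its own buffer (e.g. a numpy array's data pointer).
    let data = [1.0, 2.0, 3.5];
    let total = unsafe { sum_f64(data.as_ptr(), data.len()) };
    println!("{total}"); // 6.5
}
```

The `unsafe` boundary is confined to the FFI edge; everything behind it can be ordinary safe Rust.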

steveklabnik

> like Microsoft with its TypeScript rewrite in Go

Go is also memory safe.

gpm

I'd argue technically not, due to data races on interface values, maps, slices, and strings... but close enough for almost all purposes.

PS. Note that unlike in most languages, a data race on something like an int in Go isn't undefined behavior, just non-deterministic and discouraged.

steveklabnik

Yes, these issues are real, but as you say, it's not really the same as UB. As such, Go is generally considered a MSL.

For anyone not familiar with this, see https://go.dev/ref/mem#restrictions

Incidentally, Java is very similar: https://docs.oracle.com/javase/specs/jls/se8/html/jls-17.htm...

GaggiX

Go can have data races, so I would not consider it memory safe.

tedunangst

Since you've mentioned that you never see the annoying strike force threads that others complain about, you're in one.

steveklabnik

I said that I do not see them happening at the rate that people say they happen. I never said they don't happen.

Keyb0ardWarri0r

But can't be embedded in other projects as easily as Rust (FFI, WASM).

steveklabnik

I don't disagree, but "not as easily" is different than "cannot be."


raggi

Odd comparison / statement in the context of MS rewriting GDI in Rust

wavemode

You're saying choosing Go over Rust was a mistake? Why?

timewizard

"If you build it they will come."

AndriyKunitsyn

So, there's Skia. Skia is a high-level library that converts text to glyph indices, does high-level text layout, and caches glyphs (and it also makes GPU calls). But the actual parsing of the font file and the conversion of glyphs to bitmaps happens below, in FreeType.

Skia is made in C++. It's made by Google.

There's FreeType. It actually measures and renders glyphs, simultaneously supporting various antialiasing modes, hinting, kerning, interpreting TrueType bytecode and other things.

FreeType is made in C. It's not made by Google.

Question: why was it FreeType that got a Rust rewrite first?

raggi

It has a smaller API surface into the consuming applications and platforms.

Skia tendrils run deep and leak all over the place.

There's also a different set of work to invest in: next-generation Skia is likely to look quite different, moving much of the work onto the GPU, and this work is being researched and developed: https://github.com/linebender/vello. Some presentations about this work too: https://youtu.be/mmW_RbTyj8c https://youtu.be/OvfNipIcRiQ

interroboink

Perhaps since FreeType is the one handling the more untrusted inputs (the font files themselves, downloaded from who-knows-where), it is more at-risk and thus stood to benefit from the conversion more?

But I don't really know anything about how all the parts fit together; just speculating.

londons_explore

Skia's inputs are relatively less complex, so there is less risk of dangerous corner cases.

bawolff

Format parsing is generally considered some of the most risky type of code to have for memory safety. Skia is probably considered a less risky problem domain.

SquareWheel

I've recently been learning about how fonts render based on subpixel layouts in monitor panels. Windows assumes that all panels use RGB layout, and their ClearType software will render fonts with that assumption in mind. Unfortunately, this leads to visible text fringing on new display types, like the alternative stripe pattern used on WOLED monitors, or the triangular pattern used on QD-OLED.
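A toy illustration of why the assumed layout matters (the coverage model and both functions are illustrative, not ClearType's actual pipeline, which also applies color filtering to soften fringes):

```rust
// Classic subpixel rendering computes glyph coverage at 3x horizontal
// resolution and maps each sample onto one color channel. If the renderer
// assumes R-G-B but the panel is actually B-G-R (or something stranger,
// like WOLED/QD-OLED layouts), the intended fringes land on the wrong
// physical subpixels and become visible color artifacts.

/// Pack three horizontal coverage samples (0.0..=1.0) into an (r, g, b)
/// pixel for an RGB-stripe panel. Coverage 1.0 means fully inked, so the
/// channel value is 255 * (1 - coverage) for black text on white.
fn pack_rgb(samples: [f32; 3]) -> (u8, u8, u8) {
    let to_level = |c: f32| (255.0 * (1.0 - c)).round() as u8;
    (to_level(samples[0]), to_level(samples[1]), to_level(samples[2]))
}

/// The same samples on a BGR panel: the channel order must be swapped,
/// or the fringing described above appears.
fn pack_bgr(samples: [f32; 3]) -> (u8, u8, u8) {
    let (r, g, b) = pack_rgb(samples);
    (b, g, r)
}

fn main() {
    // A stem edge covering the left third of the pixel.
    let samples = [1.0, 0.5, 0.0];
    println!("RGB panel: {:?}", pack_rgb(samples)); // (0, 128, 255)
    println!("BGR panel: {:?}", pack_bgr(samples)); // (255, 128, 0)
}
```

Non-stripe layouts (triangular QD-OLED, RWBG WOLED) don't fit this per-channel model at all, which is why per-layout metadata from the display would be needed rather than a simple channel swap.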

Some third-party tools exist to tweak how ClearType works, like MacType[1] or Better ClearType Tuner[2]. Unfortunately, these tools don't work in Chrome/electron, which seems to implement its own font rendering. Reading this, I guess that's through FreeType.

I hope that as new panel technologies start becoming more prevalent, that somebody takes the initiative to help define a standard for communicating subpixel layouts from displays to the graphics layer, which text (or graphics) rendering engines can then make use of to improve type hinting. I do see some efforts in that area from Blur Busters[3] (the UFO Test guy), but still not much recognition from vendors.

Note I'm still learning about this topic, so please let me know if I'm mistaken about any points here.

[1] https://github.com/snowie2000/mactype

[2] https://github.com/bp2008/BetterClearTypeTuner

[3] https://github.com/microsoft/PowerToys/issues/25595

wkat4242

I'm pretty sure Windows dropped subpixel anti-aliasing a few years ago. When it did exist, there was a wizard to determine and set the subpixel layout.

Personally I don't bother anymore anyway since I have a HiDPI display (about 200dpi, 4K@24"). I think that's a better solution, simply have enough pixels to look smooth. It's what phones do too of course.

ghusbands

To be clear: Windows still does subpixel rendering, and the wizard is still there. The wizard has not actually worked properly for at least a decade at this point, and subpixel rendering is always enabled, unless you use hacks or third-party programs.

layer8

DirectWrite doesn’t apply subpixel anti-aliasing by default [0], and an increasing number of applications use it, including Microsoft Office since 2013. One reason is the tablet mode starting with Windows 8, because subpixel ClearType only works in one orientation. Nowadays non-uniform subpixel layouts like OLED panels use are another reason.

[0] https://en.wikipedia.org/wiki/ClearType#ClearType_in_DirectW...

cosmic_cheese

macOS dropped it a few years ago, primarily because there are no Macs with non-HiDPI displays any more (reducing benefit of subpixel AA) and to improve uniformity with iOS apps running on macOS via Catalyst (iOS has never supported subpixel AA, since it doesn’t play nice with frequently adjusted orientations).

Windows I believe still uses RGB subpixel AA, because OLED monitor users still need to tweak ClearType settings to make text not look bad.

grishka

> because there are no Macs with non-HiDPI displays any more

That is not true. Apple still sells Macs that don't come with a screen, namely Mac Mini, Mac Studio, and Mac Pro. People use these with non-HiDPI monitors they already own all the time.

hnuser123456

I didn't have to do any cleartype tuning on my LG C2. But maybe since it's a TV they have room for conventional subpixel layouts.

wkat4242

ohh maybe it was macOS I am confused with here. Sorry. I use every OS under the sun together.

Or it could have been the DirectWrite thing. I just don't remember where I read it.

I always thought cleartype was ugly by the way.

hnuser123456

It absolutely still does subpixel AA. Take a screenshot of any text and zoom way in, there's red and blue fringing. And the ClearType text tuner is still a functional builtin program in Win11 24H2.

nine_k

I still have subpixel antialiasing on when using a 28" 4K display. It's the same DPI as a FHD 14" display, typical on laptops. Subpixel AA makes small fonts look significantly more readable.

But it only applies to Linux, where the small fonts can be made to look crisp this way. Windows AA is worse, small fonts are a bit more blurred on the same screen, and macOS is the worst: connecting a 24" FHD screen to an MBP gives really horrible font rendering unless you make the fonts really large. I suppose it's because macOS does not do subpixel AA at all and assumes high-DPI screens only.

Clamchop

As far as I'm aware, ClearType is still enabled by default in Windows.

Subpixel text rendering was removed from MacOS some time ago, though, presumably because they decided it was not needed on retina screens. Maybe you're thinking of that?

perching_aix

It didn't. Some parts of the UI are using grayscale AA, some are on subpixel AA. And sometimes it's just a blur, to keep things fun I guess.

Pretty sure phones do grayscale AA.

tadfisher

The standard is EDID-DDDB, and subpixel layout is a major part of that specification. However I believe display manufacturers are dropping the ball here.

https://glenwing.github.io/docs/VESA-EEDID-DDDB-1.pdf

kiicia

For me, as an old-time user, (ab)using any subpixel layout for text rendering and antialiasing is counterproductive and (especially with current pixel densities, but also in general) introduces many more issues than it ever solved.

“Whole pixel/grayscale antialiasing” should be enough and then specialized display controller would handle the rest

tadfisher

Agreed, but layouts such as Pentile don't actually have all three subpixel components in a logical pixel, so you'll still get artifacts even with grayscale AA. You can compensate for this by masking those missing components.

https://github.com/snowie2000/mactype/issues/932

kiicia

surprising info, I thought this was supposed to be the part about "display controller taking care of any additional issues", thanks for link with details, will read it with interest

TheRealPomax

Mandatory reading when getting into this topic: http://rastertragedy.com/

cosmic_cheese

I may be totally off the mark here, but my understanding is that the alternative pixel arrangements found in current WOLED and QD-OLED monitors are suboptimal in various ways (despite the otherwise excellent qualities of these displays) and so panel manufacturers are working towards OLED panels built with traditional RGB subpixel arrangements that don’t forfeit the benefits of current WOLED and QD-OLED tech.

That being the case, it may end up being that in the near future, alternative arrangements end up being abandoned and become one of the many quirky “stepping stone” technologies that litter display technology history. While it’s still a good idea to support them better in software, that might put into context why there hasn’t been more efforts put into doing so.

RKFADU_UOFCCLEL

Windows has always allowed you to change the subpixel layout; it's right there in the ClearType settings.

zozbot234

Sub-pixel anti-aliasing requires outputting a pixel-perfect image to the screen, which is a challenge when you're also doing rendering on the GPU. You generally can't rely on any non-trivial part of the standard 3D-rendering pipeline (except for simple blitting/compositing) and have to use the GPU's compute stack instead to address those requirements. This adds quite a bit of complexity.


K0nserv

I appreciate the pun in the repository name https://github.com/googlefonts/fontations/

sidcool

This is a wonderful write up. Reminiscent of the old google

xyst

G engineering write ups are usually well written with plenty of useful information to carry forward.

It’s G’s _business_ folks (ie, C-level executives) that I have no respect for. Their business model of exploiting users is just awful.

tsuru

It looks like there is an extern C interface... I wonder if it is everything necessary for someone to use it via FFI.

steveklabnik

Given that it's being used in a large C++ codebase, I would assume it has everything needed to use it in that API.

pornel

They just need to rewrite the rest of Chrome to use the native Rust<>Rust interface.

(in reality Google is investing a lot of effort into automating the FFI layer to make it safer and less tedious)

sam0x17

TIL fonts have more than just a collection of vectorized glyphs in them

londons_explore

This is the kind of work that Brave et al will never do.

codedokode

> Fonts are passed through the OpenType Sanitizer prior to processing.

Are font formats so bad that the files need to be sanitized?

Also, note that they identified integer overflows as one of the causes of vulnerabilities. It is sad that today many languages do not detect overflows, and even modern architectures like RISC-V do not include overflow traps, although detecting an overflow doesn't require many logic gates. C is old, OK, but I cannot understand why new languages like Rust do not have overflow traps. Don't Rust developers know about this?

steveklabnik

Rust traps on overflow in debug mode and does two's complement wrapping in release mode. You can turn the traps on in release if you wish, but the cost is deemed too expensive to enable by default, especially when bounds are always checked, so it isn't as severe an issue in Rust.
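Concretely, the standard library also gives each behavior an explicit spelling, so code that cares never has to rely on the build profile (a small sketch; the comment about plain `+` reflects the default Cargo profiles):

```rust
// Rust's explicit integer-overflow APIs: the programmer picks the
// semantics per call site instead of relying on a global trap.

fn main() {
    let x: u8 = 250;

    // `checked_add` returns None on overflow: no trap, no silent wrap.
    assert_eq!(x.checked_add(5), Some(255));
    assert_eq!(x.checked_add(10), None);

    // `wrapping_add` is explicit two's-complement wrapping.
    assert_eq!(x.wrapping_add(10), 4);

    // `saturating_add` clamps at the type's bounds.
    assert_eq!(x.saturating_add(10), 255);

    // Plain `x + 10` would panic here in a debug build (overflow check)
    // and wrap in a release build, unless `overflow-checks = true` is
    // set in the Cargo profile.
    println!("overflow demo passed");
}
```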

codedokode

This becomes a vicious circle. Language developers do not want to make the language safer because legacy CPUs do not support overflow trapping, and CPU designers do not bother to add it because nobody needs it. For example, RISC-V spec says that they decided to not add overflow trapping because it is "easy" to do in 3 or 4 existing instructions.

snvzz

>RISC-V spec says that they decided to not add overflow trapping because it is "easy" to do in 3 or 4 existing instructions.

It is not just "easy" to do, but actually easy to do, without quotes.

And much, much easier than as an exception/trap.

And it is explicit, which makes it even better. No hidden behaviour.

jeffbee

I wonder what this means, if anything, for the future of WUFFS. TTF support is on their roadmap. If they can get acceptable performance and safety from Rust, will they still drive on with WUFFS?