
Algorithms for Modern Processor Architectures

appreciatorBus

Looks like this was delivered earlier today at SEA 2025. I hope video will be available soon!

https://x.com/lemire/status/1947615932702200138

curiouscoding

I don't think talks are being recorded, unfortunately.

NooneAtAll3

apple still uses utf16?

vanderZwan

JavaScript does, so the web does, so by extension Apple probably does care about utf16.

jiggawatts

Also: Java, .NET, and Windows all use 2-byte char types.

markasoftware

Is this talk about Apple? Regardless, lots of language runtimes still use UTF-16 (e.g. Java, Qt, Haskell), and Windows certainly still uses UTF-16.

phkahler

Pentium 4 didn't hit 3.8GHz. It melted at 1.4 or so.

wtallis

The Pentium 3 is what eventually topped out at 1.4 GHz, for the 130nm Tualatin parts introduced in 2001. The Pentium 4 started at 1.4GHz and 1.5GHz with the 180nm Willamette parts introduced in 2000. Those were eventually released with speeds up to 2.0GHz. The 130nm Pentium 4 Northwood reached 3.4GHz in 2004, and the 90nm Pentium 4 Prescott hit 3.8GHz later in 2004.

bayindirh

Intel released a couple of Pentium 4's from different cores topping out at 3.8GHz [0].

Tom's Hardware overclocked one of these Northwood Pentium 4's to 5 GHz with liquid nitrogen and a compressor [1].

Those were the days, honestly.

[0]: https://en.wikipedia.org/wiki/Pentium_4

[1]: https://www.youtube.com/watch?v=z0jQZxH7NgM

necubi

The Pentium 4 HT 670, released in 2005, came factory-clocked at 3.8 GHz (https://www.techpowerup.com/cpu-specs/pentium-4-ht-670.c20)

NetBurst lasted a long time while Intel was floundering, before Core Duo was released in 2006.

IgnaciusMonk

I do not want to be rude, but this is exactly why LLVM being in the hands of the same entity that controls access to / owns the platform is insane.

edit - #64 E ! Also, I always say the human body is the most error-prone measuring device humans have at their disposal.

bayindirh

Both LLVM and GCC are supported directly by processor manufacturers. Yes, Apple and Intel have their own LLVM versions, but as long as they don't break compatibility with GCC and don't explicitly prevent porting, I don't see a problem.

I personally use the GCC suite exclusively though, and while LLVM is not my favorite compiler, we can thank them for spurring the GCC team into action to improve their game.

gleenn

Can you be more explicit? Is it because they are optimizing too much to a single platform that isn't generalizable to other compilers or architectures? What's your specific gripe?

almostgotcaught

Whose hands exactly is LLVM in?

IgnaciusMonk

Also, to be more controversial: Red Hat deprecated x86_64-v1 & x86_64-v2, and people were crying because of that...

volf_

A commercial enterprise is dropping support for older cpu architectures in their newer OSs so they can improve the average performance of the deployed software?

Don't see how that's controversial. It's something that doesn't matter to their customers or their business.

bayindirh

The newest x86_64-v1 server is older than a decade now, and I'm not sure -v2 is deprecated. RockyLinux 9 is running happily on -v2 hardware downstairs.

Oh, -v2 is deprecated for RH10. Not a big deal, honestly.

From a fleet perspective, I prefer that more code use the more advanced instructions on my processors. Efficiency possibly goes up on hot code paths. What's not to love?