Show HN: I spent 6 years building a ridiculous wooden pixel display
benholmen.com
Qwen-Image: Crafting with native text rendering
qwenlm.github.io
Marking 13 Years on Mars, NASA's Curiosity Picks Up New Skills
jpl.nasa.gov
How we made JSON.stringify more than twice as fast
v8.dev
Indian Sign Painting: A typeface designer's take on the craft
bl.ag
Content-Aware Spaced Repetition
giacomoran.com
Once a death sentence, cardiac amyloidosis is finally treatable
nytimes.com
SpaceX's Cellular Starlink Expands to Support IoT Devices
me.pcmag.com
Job-seekers are dodging AI interviewers
fortune.com
AWS European Sovereign Cloud to be operated by EU citizens
aboutamazon.eu
OpenIPC: Open IP Camera Firmware
openipc.org
Perplexity is using stealth, undeclared crawlers to evade no-crawl directives
blog.cloudflare.com
How we built Bluey’s world
itsnicethat.com
What Can a Cell Remember?
quantamagazine.org
A deep dive into Rust and C memory interoperability
notashes.me
Show HN: Sidequest.js – Background jobs for Node.js using your database
docs.sidequestjs.com
Objects should shut the fuck up
dustri.org
Century-old stone “tsunami stones” dot Japan's coastline (2015)
smithsonianmag.com
JSON encoding is a huge impediment to interprocess communication in NodeJS.
Sooner or later it seems like everyone gets the idea of reducing event loop stalls in their NodeJS code by trying to offload JSON serialization to another thread, only to discover they’ve tripled the CPU load on the main thread.
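A minimal sketch of why that happens (my illustration, not anything from the article): even if JSON.stringify runs inside a worker, postMessage has to structured-clone the payload on the sending thread first, so the main thread still pays a serialization cost of the same order.

```js
const { Worker } = require('node:worker_threads');

// Worker body: the stringify itself does run off the main thread here.
const workerSource = `
  const { parentPort } = require('node:worker_threads');
  parentPort.on('message', (obj) => {
    parentPort.postMessage(JSON.stringify(obj));
  });
`;

const worker = new Worker(workerSource, { eval: true });

// Hypothetical payload, just to have something big enough to matter.
const payload = {
  rows: Array.from({ length: 100_000 }, (_, i) => ({ id: i, name: 'row ' + i })),
};

worker.once('message', (json) => {
  console.log('serialized length:', json.length);
  worker.terminate();
});

// postMessage must structured-clone `payload` on the main thread before the
// worker ever sees it, so the sending side still does serialization work
// comparable to calling JSON.stringify directly.
worker.postMessage(payload);
```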
I’ve seen people stringify arrays one entry at a time. Sounds like maybe they are doing that internally now.
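For reference, the "one entry at a time" approach usually looks something like this sketch (assumed helper name, not from the post): stringify each element separately, join the pieces, and yield to the event loop between chunks so a single giant stringify call doesn't stall everything.

```js
const { setImmediate: yieldToLoop } = require('node:timers/promises');

// Stringify a large array in slices, yielding to the event loop periodically.
async function stringifyArrayIncrementally(items, chunkSize = 1000) {
  const pieces = ['['];
  for (let i = 0; i < items.length; i++) {
    if (i > 0) pieces.push(',');
    pieces.push(JSON.stringify(items[i]));
    // Let other callbacks run between chunks instead of one long stall.
    if (i % chunkSize === chunkSize - 1) await yieldToLoop();
  }
  pieces.push(']');
  return pieces.join('');
}

// Usage: for plain JSON-serializable entries this produces the same string
// as JSON.stringify(rows), just spread across many event loop turns.
const rows = Array.from({ length: 50_000 }, (_, i) => ({ id: i }));
stringifyArrayIncrementally(rows).then((json) => console.log(json.length));
```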
If anything, I would encourage the V8 team to go further with this. Can you avoid bailing out for subsets of data? What about the CString issue? Does this bring faststr back from the dead?