U.S. government takes 10% stake in Intel
cnbc.com
Nitro: A tiny but flexible init system and process supervisor
git.vuxu.org
Will AI Destroy the World Wide Web?
cacm.acm.org
The First Media over QUIC CDN: Cloudflare
moq.dev
Top Secret: Automatically filter sensitive information
thoughtbot.com
Leaving Gmail for Mailbox.org
giuliomagnifico.blog
Launch HN: BlankBio (YC S25) - Making RNA Programmable
The issue of anti-cheat on Linux (2024)
tulach.cc
Should the web platform adopt XSLT 3.0?
github.com
LabPlot: Free, open source and cross-platform Data Visualization and Analysis
labplot.org
Writing Micro Compiler in OCaml (2014)
troydm.github.io
Show HN: Clyp – Clipboard Manager for Linux
github.com
Waymo granted permit to begin testing in New York City
cnbc.com
Closing the Nix gap: From environments to packaged applications for rust
devenv.sh
Launch HN: Inconvo (YC S23) – AI agents for customer-facing analytics
Tesla Hasn't Filed Crash Reports on Time. Federal Investigators Want to Know Why
wsj.com
Our Response to Mississippi's Age Assurance Law
bsky.social
What about using rel="share-url" to expose sharing intents?
shkspr.mobi
Build Log: Macintosh Classic
jeffgeerling.com
Control shopping cart wheels with your phone (2021)
begaydocrime.com
Show HN: Pinch – macOS voice translation for real-time conversations
startpinch.com
Making LLMs Cheaper and Better via Performance-Efficiency Optimized Routing
arxiv.org
The biggest weakness of genetic algorithms is that they can't make use of gradients, so they have no idea how to 'move' towards the solution; they end up guessing and refining their guesses, which makes them much slower to converge.
Their advantage is that they don't require gradients (so the fitness function doesn't need to be differentiable), but I don't think they're going to be the next big thing.
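The guess-and-refine loop described above can be sketched in a few lines. This is a minimal toy example, not any particular library's implementation: the fitness function, population size, and mutation scale are all made-up illustrative choices. Note that the fitness function is only ever *evaluated*, never differentiated, which is exactly the trade-off the comment describes.

```python
import random

def fitness(x):
    # Toy objective: maximize -(sum of squares), optimum at the origin.
    # The algorithm only calls this; it never needs its derivative.
    return -sum(v * v for v in x)

def evolve(pop_size=50, dims=5, generations=200, sigma=0.3, seed=0):
    rng = random.Random(seed)
    # Start from random guesses; with no gradient, there is no
    # preferred direction to move in.
    pop = [[rng.uniform(-5, 5) for _ in range(dims)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # selection: keep the fitter half
        # Mutation: blind Gaussian tweaks to randomly chosen parents.
        children = [
            [v + rng.gauss(0, sigma) for v in rng.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))  # approaches 0 only after many blind evaluations
```

Gradient descent on the same objective would step straight downhill each iteration; the GA instead spends thousands of fitness evaluations sampling and discarding, which is the slower convergence the comment points to.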