Qwen3-Omni: Native Omni AI model for text, image and video
github.com
Choose Your Own Adventure
filfre.net
Fine-grained HTTP filtering for Claude Code
ammar.io
Cap'n Web: a new RPC system for browsers and web servers
blog.cloudflare.com
Gauntlet AI (YC S17) Master building with AI, get $200k+ job. All expenses paid
apply.gauntletai.com
OpenAI and Nvidia announce partnership to deploy 10GW of Nvidia systems
openai.com
Why haven't local-first apps become popular?
marcobambini.substack.com
Categorical Foundations for Cute Layouts
research.colfax-intl.com
Diffusion Beats Autoregressive in Data-Constrained Settings
blog.ml.cmu.edu
I Was a Weird Kid: Jailhouse Confessions of a Teen Hacker
bloomberg.com
Rand Paul: FCC chair had "no business" intervening in ABC/Kimmel controversy
arstechnica.com
Unweaving warp specialization on modern tensor core GPUs
rohany.github.io
Is a movie prop the ultimate laptop bag?
blog.jgc.org
A board member's perspective of the RubyGems controversy
apiguy.substack.com
AI-generated “workslop” is destroying productivity?
hbr.org
I'm spoiled by Apple Silicon but still love Framework
simonhartcher.com
PlanetScale for Postgres is now GA
planetscale.com
Cloudflare is sponsoring Ladybird and Omarchy
blog.cloudflare.com
Transforming recursion into iteration for LLVM loop optimizations
dspace.mit.edu
What happens when coding agents stop feeling like dialup?
martinalderson.com
Beyond the Front Page: A Personal Guide to Hacker News
hsu.cy
I fail to understand why we would lack data. Sure, the supply of (historical) text is limited, but if we just open up all available video and send interactive robots out into the world, we'll drown in data. On top of that there is simulated data, plus tons of sensors that can capture vast amounts of even more.
Edit: from the source [1], this quote pretty much sums it up: "Our 2022 paper predicted that high-quality text data would be fully used by 2024, whereas our new results indicate that might not happen until 2028."
[1] https://epoch.ai/blog/will-we-run-out-of-data-limits-of-llm-...