NPM Package with 56K Downloads Caught Stealing WhatsApp Messages
koi.ai
The Illustrated Transformer
jalammar.github.io
Ultrasound Cancer Treatment: Sound Waves Fight Tumors
spectrum.ieee.org
GLM-4.7: Advancing the Coding Capability
z.ai
The Garbage Collection Handbook
gchandbook.org
Flock Exposed Its AI-Powered Cameras to the Internet. We Tracked Ourselves
404media.co
NIST was 5 μs off UTC after last week's power cut
jeffgeerling.com
Claude Code gets native LSP support
github.com
Scaling LLMs to Larger Codebases
blog.kierangill.xyz
Things I learnt about passkeys when building passkeybot
enzom.dev
How the RESISTORS put computing into 1960s counter-culture
spectrum.ieee.org
Universal Reasoning Model (53.8% pass@1 on ARC-1 and 16.0% on ARC-2)
arxiv.org
US blocks all offshore wind construction, says reason is classified
arstechnica.com
The biggest CRT ever made: Sony's PVM-4300
dfarq.homeip.net
Show HN: C-compiler to compile TCC for live-bootstrap
github.com
There Is No Future for Online Safety Without Privacy and Security
itsfoss.com
Uplane (YC F25) Is Hiring Founding Engineers (Full-Stack and AI)
useparallel.com
Hybrid Aerial Underwater Drone – Bachelor Project [video]
youtube.com
Ask HN: Why isn't there competition to LinkedIn yet?
Tc – Theodore Calvin's language-agnostic testing framework
github.com
Jimmy Lai Is a Martyr for Freedom
reason.com
Sounds like a further improvement in the spirit of HRM & TRM models.
Decent comment via X: https://x.com/r0ck3t23/status/2002383378566303745
I continue to be fascinated by these architectures that:

- Build recurrence / inference-time scaling into transformers more natively.
- Don't use full recurrent gradient traces, and succeed not just despite that, but because of it.
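The "no full recurrent gradient trace" point can be sketched with a toy scalar model (all names and the update rule here are hypothetical, chosen only to illustrate the idea): iterate a latent refinement step several times, but compute the gradient only through the final step, treating the penultimate state as a constant, rather than backpropagating through the whole recurrence as full BPTT would.

```python
import math

def recurrent_forward(w, x, z0, steps):
    # Iterate a toy latent update z <- tanh(w*z + x), standing in for
    # the recurrent refinement loop in HRM/TRM-style models.
    z = z0
    trace = [z]
    for _ in range(steps):
        z = math.tanh(w * z + x)
        trace.append(z)
    return z, trace

def one_step_grad(trace):
    # "Detached" gradient w.r.t. w: differentiate only the final update,
    # treating the penultimate state z_prev as a constant -- a one-step
    # approximation instead of a full gradient trace through all steps.
    z_prev, z_last = trace[-2], trace[-1]
    # With u = w*z_prev + x:  d tanh(u)/dw = (1 - tanh(u)^2) * z_prev
    return (1 - z_last ** 2) * z_prev

w, x = 0.5, 0.3
z_final, trace = recurrent_forward(w, x, z0=0.0, steps=8)
g = one_step_grad(trace)
# z_final settles near the fixed point of the update; g is the cheap
# one-step gradient estimate used in place of backprop through time.
```

The appeal, as the comment notes, is that the cost of the gradient stays constant no matter how many refinement steps you run at inference time.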