Hashcards: A Plain-Text Spaced Repetition System
borretti.me
The Typeframe PX-88 Portable Computing System
typeframe.net
Ask HN: What Are You Working On? (December 2025)
Do Dyslexia Fonts Actually Work? (2022)
edutopia.org
Developing a food-safe finish for my wooden spoons
alinpanaitiu.com
AI and the ironies of automation – Part 2
ufried.com
In the Beginning Was the Command Line (1999)
web.stanford.edu
Shai-Hulud compromised a dev machine and raided GitHub org access: a post-mortem
trigger.dev
GraphQL: The enterprise honeymoon is over
johnjames.blog
Price of a bot army revealed across online platforms
cam.ac.uk
Standalone Meshtastic Command Center – One HTML File Offline
github.com
Illuminating the processor core with LLVM-mca
abseil.io
Linux Sandboxes and Fil-C
fil-c.org
Stop crawling my HTML – use the API
shkspr.mobi
Vacuum Is a Lie: About Your Indexes
boringsql.com
Apple Maps claims it's 29,905 miles away
mathstodon.xyz
Compiler Engineering in Practice
chisophugis.github.io
iOS 26.2 fixes 20 security vulnerabilities, 2 actively exploited
macrumors.com
More atmospheric rivers coming for flooded Washington and the West Coast
cnn.com
Kimi K2 1T model runs on 2 512GB M3 Ultras
twitter.com
Efficient Basic Coding for the ZX Spectrum
blog.jafma.net
Getting into Public Speaking
james.brooks.page
I fed 24 years of my blog posts to a Markov model
susam.net
Show HN: Cargo-rail: graph-aware monorepo tooling for Rust; 11 deps
github.com
Hey. Author here.
I wrote a longer post about the motivation and design: https://dev.to/loadingalias/cargo-rail-making-rust-monorepos...
The Problem:
I've been working on a low-level Rust workspace for a while now. Before I knew it, my 'justfile' was over 1k lines and I had 30 shell scripts for testing. My dep graph was WAY too large. I couldn't easily split out a single crate, or a few crates, to release as OSS repos... I'd have had to use Google's Copybara (Java tooling or their GHA) or a mountain of 'git subtree' and filter scripts.
The Solution:
- Dependency Unification: I use Cargo's resolver output (not syntax parsing) to unify versions, compute MSRV, and prune dead dependencies and features. Setting 'pin_transitives=true' fully replaces cargo-hakari. The graph stays lean across all target triples with a single command.
- Change Detection: The local/CI 'affected' command is graph-aware, so I only check/test/bench what changed, and 'test' is Nextest-native (rough sketch of the graph walk below).
- Split/Sync: I keep a canonical monorepo, then extract one or more crates with full git history into new repos, with bi-directional sync and 3-way merge conflict resolution.
- Release/Publish: Dependency-order publishing, changelog generation, tagging... but in 11 dependencies instead of hundreds.
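To make 'graph-aware' concrete, here's a rough sketch (not cargo-rail's actual code) of how affected-crate detection can be computed from 'cargo metadata' output with the cargo_metadata crate: build a reverse-dependency map over workspace members, then walk it outward from the crates whose files changed.

    // Rough sketch only: graph-aware "affected" detection on top of
    // `cargo metadata` output, via the cargo_metadata crate.
    // Keyed by crate name for brevity; real tooling would key by package ID.
    use std::collections::{HashMap, HashSet};

    use cargo_metadata::{CargoOpt, MetadataCommand};

    /// Given workspace crates whose files changed, return them plus every
    /// workspace crate that transitively depends on one of them.
    fn affected(changed: &HashSet<String>) -> cargo_metadata::Result<HashSet<String>> {
        let meta = MetadataCommand::new()
            .features(CargoOpt::AllFeatures)
            .exec()?;

        // Names of workspace members only; external deps never mark anything affected.
        let members: HashSet<String> = meta
            .workspace_members
            .iter()
            .map(|id| meta[id].name.to_string())
            .collect();

        // Reverse-dependency map: crate -> workspace crates that depend on it.
        let mut rdeps: HashMap<String, Vec<String>> = HashMap::new();
        for id in &meta.workspace_members {
            let pkg = &meta[id];
            for dep in &pkg.dependencies {
                if members.contains(&dep.name) {
                    rdeps
                        .entry(dep.name.clone())
                        .or_default()
                        .push(pkg.name.to_string());
                }
            }
        }

        // Walk reverse edges outward from the changed crates.
        let mut out = changed.clone();
        let mut stack: Vec<String> = changed.iter().cloned().collect();
        while let Some(name) = stack.pop() {
            for dependent in rdeps.get(&name).into_iter().flatten() {
                if out.insert(dependent.clone()) {
                    stack.push(dependent.clone());
                }
            }
        }
        Ok(out)
    }

In practice you'd also map changed file paths to members via each package's manifest_path, but the shape of the walk is the same.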
Key Decisions:
- 11 core deps / 55 resolved deps = minimal supply-chain attack surface
- Multi-target resolution = 'cargo metadata --filter-platform' per target in parallel (rayon), so dead dependencies/features are actually dead
- System git = your local 'git' binary is invoked directly for deterministic SHAs (JJ compatibility is native, obviously)
- Lossless TOML = 'toml_edit' preserves comments and manifest formatting (toy example below)
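On the lossless TOML point, here's a toy toml_edit round-trip (illustrative only, not lifted from cargo-rail): bump one dependency version without disturbing comments or layout anywhere else in the manifest.

    // Toy example of lossless manifest editing with toml_edit (illustrative only).
    use toml_edit::{value, DocumentMut};

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        let manifest = r#"
    [package]
    name = "demo"  # this comment survives the round-trip
    version = "0.1.0"

    [dependencies]
    serde = "1.0.100"
    "#;

        let mut doc: DocumentMut = manifest.parse()?;

        // Only this value changes; untouched lines keep their comments and formatting.
        doc["dependencies"]["serde"] = value("1.0.200");

        let out = doc.to_string();
        assert!(out.contains("# this comment survives the round-trip"));
        assert!(out.contains(r#"serde = "1.0.200""#));
        Ok(())
    }

A format-preserving round-trip like this is what keeps bulk manifest rewrites reviewable as small diffs.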
Tested On:
- tikv, meilisearch, helix, helix-db, tokio, ripgrep, polars, ruff, codex, and more. Forks with cargo-rail configured at github.com/loadingalias.
In my own workspace, change detection alone removed ~1k LoC, cut CI costs by ~80%, and made my builds (especially cold ones) quicker and leaner.
Happy to discuss the implementation.