Improving Recommendation Systems & Search in the Age of LLMs
eugeneyan.com
NixOS and reproducible builds could have detected the xz backdoor
luj.fr
Tencent's 'Hunyuan-T1'–The First Mamba-Powered Ultra-Large Model
llm.hunyuan.tencent.com
Map Features in OpenStreetMap with Computer Vision
blog.mozilla.ai
PyTorch Internals: Ezyang's Blog
blog.ezyang.com
EmptyEpsilon open source spaceship bridge simulator
daid.github.io
Landrun: Sandbox any Linux process using Landlock, no root or containers
github.com
Have we underestimated the total number of people on Earth?
newscientist.com
Mathematical Methods for Physics [pdf]
ma.imperial.ac.uk
Quitting an Intel x86 Hypervisor
halobates.de
300-year-old Polish beech voted Tree of the Year
bbc.co.uk
Through a Glass Lushly: Michalina Janoszanka's Reverse Paintings (Ca. 1920s)
publicdomainreview.org
Differential Geometry: A First Course in Curves and Surfaces [pdf]
math.franklin.uga.edu
Derivatives and Logarithms of 3D Transforms
nosferalatu.com
Show HN: FastOpenAPI – automated docs for many Python frameworks
github.com
Domu Technology Inc. (YC S24) Is Hiring a Vibe Coder
ycombinator.com
Crabtime: Zig’s Comptime in Rust
crates.io
Metabolism Can Shape Cells' Destinies
quantamagazine.org
Show HN: We made an MCP server so Cursor can debug Node.js on its own
npmjs.com
Paul A. M. Dirac, Interview by Friedrich Hund (1982) [video]
youtube.com
Optimizing Brainfuck interpreter in the C preprocessor
github.com
There are a few really nice and freely available ML resources out there. But in order to truly get somewhere, I believe there's no way around accompanying theory with hands-on, practical exercises.
Some textbooks do include exercises, but when you're self-studying you're sort of at a loss if you face an exercise you cannot solve on your own. If you're lucky, you can ask a question online on platforms such as Stack Exchange and keep your fingers crossed that you get a good answer, but this form of interaction is severely inferior to a true face-to-face tutor-student exchange, where you can quickly and easily ask follow-up questions or get hints that lead you to the right answer, rather than having the correct answer presented to you.
Further, in a field like ML, which really combines deep theory with practical implementation, you should be able to do both theoretical exercises and hands-on programming tasks. Again, there's a lot of material out there on the internet, but it's not straightforward to self-study, especially as a complete beginner. You'll inevitably end up in situations where it's left to you to wade through various online tutorials on how to use a certain library or how to deal with some error message. In a way, that is part of our business in general; but for studying ML (or any other field), it's a distraction from your actual goal. The more time and energy you spend on these detours just to get your exercises done, the more likely you are to give up.
Many approaches to educating yourself online have been offered, from freely available resources like this Concise Machine Learning book, to lecture recordings on YouTube or university websites, all the way up to proper online universities and classes from Coursera and the like. I think, however, there is still an opportunity for a true self-studying platform that intelligently integrates programming exercises in a scalable way, with a focus on helping you through whenever you get stuck. That latter part especially is IMO still underdeveloped.
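To make that last idea a bit more concrete, here's a minimal Python sketch of the kind of exercise loop I mean: run the learner's code against a few test cases and, on failure, hand the context to a model that produces a hint instead of the answer. Everything here (Exercise, ask_llm_for_hint, the MSE example) is made up for illustration; the hint function is just a stub you'd wire up to whatever model you have access to.

    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    @dataclass
    class Exercise:
        prompt: str                        # task description shown to the learner
        tests: List[Tuple[tuple, object]]  # (args, expected result) pairs

    def ask_llm_for_hint(prompt: str, args: tuple, got: object) -> str:
        """Placeholder: ask a model for a nudge, not the solution."""
        return (f"For input {args!r} your code returned {got!r}; "
                f"re-read the task ({prompt}) and check your edge cases.")

    def check(exercise: Exercise, solution: Callable) -> None:
        for args, expected in exercise.tests:
            got = solution(*args)
            if got != expected:
                print("Not quite. Hint: " + ask_llm_for_hint(exercise.prompt, args, got))
                return
        print("All tests passed - on to the next exercise.")

    # Example: a classic warm-up exercise.
    ex = Exercise(
        prompt="Implement mean squared error between two equal-length lists.",
        tests=[(([1.0, 2.0], [1.0, 2.0]), 0.0),
               (([0.0, 0.0], [1.0, 1.0]), 1.0)],
    )

    def my_mse(y_true, y_pred):
        return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)

    check(ex, my_mse)

The point isn't the grading itself (plenty of platforms do that) but that the failure context goes to a tutor-like component whose job is to nudge, not to solve.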
GPTs have great potential to close this gap, but one thing I don't love about the status quo is that this space is mostly covered by commercial entities. There isn't much of a community effort in educational AI apps, and one big factor is the cost of running the models. In classical open-source territory, one big appeal has been that you can develop software without spending much money at all - basically, all you need is a decent computer to get going. Publishing your library or app is taken care of thanks to GitHub et al., and building a community is a lot more effort but still essentially free (except for your time investment, a cost which shouldn't be neglected). But building on GPTs costs actual money, which puts many open-source enthusiasts in a more difficult situation.
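Just to put a rough number on that last point, here's a back-of-envelope calculation with purely illustrative figures - the token counts and prices are assumptions I picked for the example, not quotes from any provider:

    # Made-up but plausible assumptions: ~1,500 input tokens of context plus
    # ~300 output tokens per hint, priced at $1 per million input tokens and
    # $4 per million output tokens (check your provider's actual rates).
    hints_per_learner_per_day = 20
    learners = 5_000

    input_tokens = 1_500 * hints_per_learner_per_day * learners   # 150M tokens/day
    output_tokens = 300 * hints_per_learner_per_day * learners    # 30M tokens/day

    cost_per_day = input_tokens / 1e6 * 1.00 + output_tokens / 1e6 * 4.00
    print(f"~${cost_per_day:,.0f} per day")  # ~$270/day under these assumptions

Even if the real numbers differ, the shape of the problem is the same: a free community project suddenly has a per-user operating cost, which GitHub-style hosting never had.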