Engineering "home-cooked" software
19 comments
January 14, 2025
smileysteve
VyseofArcadia
> An agile process should build a feature, release that feature, interview users, analyze system behavior, and iterate: improving users' goals, adding appropriate scale, and removing unexpected errors or behavior.
I feel like there's this no true Scotsman thing going on with agile. Whenever someone describes their actual experiences with agile, there's always at least one person who speaks up and decries it as not real agile, explaining what agile should be.
At this point I don't care what agile should be. I just don't want management shoving agile down my throat anymore. I've yet to see it actually improve productivity for any team I've been on. Real agile must be exceedingly rare.
thraxil
> I feel like there's this no true Scotsman thing going on with agile. Whenever someone describes their actual experiences with agile, there's always at least one person who speaks up and decries it as not real agile, explaining what agile should be.
It's not mysterious or confusing. The original definition is at https://agilemanifesto.org/
jimbokun
In your experience, what has improved productivity?
VyseofArcadia
Weekly standup to check in with devs; leave them alone otherwise. Reach out if a high-priority item comes up, but 9 times out of 10 that can be an email.
I've heard that one of the benefits of agile is identifying blockers and encouraging collaboration, but I saw much better results from assuming you've hired intelligent adults with work ethic and letting them reach out and collaborate as needed. Daily standups, sprints, boards, planning, etc. are great in a low-trust environment where you can't be sure people are doing the right things. But if you've hired self-directed people, that stuff just gets in the way.
mschild
In the same vein, I recommend reading this post by Robin Sloan https://www.robinsloan.com/notes/home-cooked-app/
jppope
False dichotomy, and not a great analogy. I see what the author is going for, but they could have stood to stress-test their ideas before clicking "publish."
Just for fun, let me pose: where would a Michelin-star chef perform better? What's the equivalent of fast casual or fine dining? Does the type of cuisine influence the outcomes?
1dom
I thought it was a good analogy, it made sense in the context of the article.
Any analogy looks bad if you stretch it beyond the author's intent and then act as if that's somehow the author's fault or problem.
I think it's interesting to try to think about including the chef in the analogy, but I don't want to entertain it, because the tone of your post felt unnecessarily rude, essentially: "duuuh, did you even read your stuff before pressing post, idiot."
6510
I agree with the fun. The fancy restaurant offers a complete experience, but if we look only at the food, the Michelin-star chef cooks very different things at home. The lamest constraint I've heard is that to get the stars, the kitchen must be French.
meltyness
It's not like the market isn't aware of this. This kind of motivates middleware more generally, so that the design and functionality of custom services can be paid due care.
bokohut
I have been home cooking for a long while now, and a portion of that past cooking has turned into acquired businesses that still operate to this day on the system platforms I architected and wrote. Cooking is a skill that takes great time, effort, and commitment, yet since nearly no one has time to stop and appreciate a good meal, we have a compounding societal health issue, just as with "fast-food" software we have a rapidly compounding cybersecurity issue. Everything has a pattern and a cycle, so once you spot that pattern, be sure to cook something up to ride the next crest.
I love to cook and have fed many yet I also have come to accept that most people never meet the chef who made their meal.
Stay Healthy!
pc86
And just think, to those platforms your code is the "seasoning" that the FA decries so vociferously.
hit8run
I can relate. But home cooking is also often reinventing the wheel. Core logic should be home-cooked, but general aspects are probably best solved using battle-proven 3rd-party dependencies? What do you guys think?
hansvm
Producing that general-purpose API from "battle proven 3rd party dependencies" comes at a cost. IME:
1. The API isn't quite what you need, so you add enough extra logic to integrate it that you might as well have reinvented the wheel and had a proper solution, plus you get all the bugs and performance problems that come from bolting extra code onto a domain mismatch.
2. The API covers cases you don't care about, opening you up to bugs and performance problems from the project's complexity while not actually saving much time since the cases you care about can be handled more simply.
3. Both (1) and (2) are amplified as the project changes going forward. If you wrote the data structure yourself, you could easily tack on the modifications you need. If not, you end up re-implementing it anyway (or bolting on even more hacks), and the 3rd party isn't a good starting point because of (2).
For something like cryptography, with all the ways constant-time execution and whatnot can bite me in the ass, I'm happy to use a 3rd-party dependency. For almost everything else I'll home-cook it. Networking starts at io_uring. CPU float-intensive software starts with (depending on the domain) a Tensor type capable of lazy, fused operations [0].
That's just a heuristic I use, and from time to time I'll definitely take a shortcut. More often than not though, I'll be replacing the shortcut in less than a year, and the value from having the feature sooner is only sometimes worth the lost productivity. The driving factor is just that as I learn more it's easier and easier to home-cook those sorts of things, whereas the costs induced by 3rd parties haven't really diminished.
[0] I'm curious what a language designed around cache obliviousness and reducing data dependencies might look like. There are some patterns I've used which compose nicely, but they only solve part of the problem. Even custom vector languages like ISPC require a fair bit of work from the programmer.
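The "lazy, fused operations" idea mentioned above can be sketched in a few lines. This is my own illustration, not hansvm's code: the `Lazy` class and its names are hypothetical, and a real Tensor type would add shapes, SIMD, and buffers. The point is only that expressions build up deferred computations, and evaluation then makes a single pass with no temporary arrays for intermediates.

```python
# Hypothetical sketch of a lazy, fused elementwise Tensor.
# Each operation composes index->value functions instead of
# allocating an intermediate result array.

class Lazy:
    """Defers elementwise ops; evaluation fuses them into one pass."""
    def __init__(self, fn):
        self.fn = fn  # maps an index to a value

    @classmethod
    def from_list(cls, data):
        d = list(data)
        return cls(lambda i: d[i])

    def __add__(self, other):
        # No addition happens here; we just compose functions.
        return Lazy(lambda i: self.fn(i) + other.fn(i))

    def __mul__(self, other):
        return Lazy(lambda i: self.fn(i) * other.fn(i))

    def evaluate(self, n):
        # One loop over the data: (a + b) * c is computed per element,
        # with no temporary array materialized for the sum.
        return [self.fn(i) for i in range(n)]

a = Lazy.from_list([1, 2, 3])
b = Lazy.from_list([10, 20, 30])
c = Lazy.from_list([2, 2, 2])
print(((a + b) * c).evaluate(3))  # [22, 44, 66]
```

In a compiled setting, the same structure is what lets an expression like `(a + b) * c` turn into one fused loop instead of two.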
jppope
I think the article is a false dichotomy, but to answer your question: most things 3rd-party dependencies do can be accomplished easily if you know how to program (cryptography and well-defined algorithms aside). Frameworks and libraries are more like tooling, though they usually have an incentive to lock you into their view of the world.
If you are a good programmer/independent thinker, you will tend to just write software from first principles with limited tooling. It's leaner and faster to build that way, and it's usually more effective software. 3rd-party libraries, though easy to pull in, bring their own interfaces/paradigms. They require maintenance/security updates. They are often written by individuals who care little about performance.
There are TONS of exceptions to what I am saying above, and tons of great packages that I use frequently, but if your default is to solve a problem by installing a package, now you have two problems.
tartoran
Absolutely, but the decision does require some experience. For example, you wouldn't want to home-cook your own logger; you'd rely on a battle-tested one instead. However, if you want to trim/pad strings, you can easily avoid a third-party dependency.
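The trim/pad case really is a few lines. A minimal sketch (Python here just for illustration; the helper names are mine, not from any library):

```python
# Home-cooked string trim/pad helpers -- the kind of utility that
# rarely justifies a third-party dependency.

def trim(s: str) -> str:
    """Strip leading and trailing whitespace."""
    return s.strip()

def pad_left(s: str, width: int, fill: str = " ") -> str:
    """Left-pad s with fill characters up to width; never truncates."""
    if len(s) >= width:
        return s
    return fill * (width - len(s)) + s

print(trim("  hello  "))       # "hello"
print(pad_left("42", 5, "0"))  # "00042"
```

Owning these ten lines means no transitive dependency tree, no supply-chain surface, and no waiting on an upstream fix.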
athenot
A good use for home-cooked software is a UI layer over something clunky, or a glue layer over several systems that are cumbersome to use.
For me, that's the best use of time without re-inventing the wheel—though this is specifically for network apps and for addressing pain points of existing SaaS products.
meiraleal
Now, with LLMs, I've been recreating just the features I need from 3rd-party dependencies and getting to know much more about how they work under the hood.
nayuki
I would say that this blog post is half right. The half that's not right is glaring.
> the pyramids have had 100% up-time with no human maintenance
It helps that there is hardly any rain in the desert. Water would foul up the structure in a matter of years.
> dependencies are added like seasoning. Hundreds of packages. Thousands of foreign lines of code make their way onto your software routinely
True, and I think NPM JavaScript exhibits the worst of this behavior.
> Problems are expected and fixed on the fly, somewhat haphazardly: ... push a patch ... code scanning ... dependa-bot ... DevOps
True, and I remember when software releases were treated with much more care and quality, because it cost a lot to ship a CD or game cartridge rather than a download.
> More waterfall-y.
That's not a good thing. Having a short feedback cycle from implementation back to design is a big win for software development. Waterfall is the dark ages when we didn't know better.
> This is where minimalist software is built.
Agreed.
> No build process
That doesn't make any sense, unless you're writing machine code directly in hexadecimal.
> no outside dependencies
I agree with minimizing dependencies, but there's no such thing as no dependencies. I dare you to avoid any of these libraries: HTTP (especially HTTP/2 and 3 which are much harder than HTTP/1), TLS/SSL, TCP/IP, hardware drivers, compilers, data codecs like DEFLATE, multimedia codecs like AVC and AAC.
> There's no need to be constantly refactoring things, since everything was designed up to spec beforehand. ... The catch? Writing such a spec costs you over 80% of your engineering time, and you'll have nothing to show for it until day 100.
This is a fallacy. I agree with and like this talk where the speaker Glenn Vanderburg argues that the software is the specification, the construction is done by compilers, and that most analogies to physical engineering are completely wrong. https://www.youtube.com/watch?v=NP9AIUT9nos
> The thing is, most humans are laughably bad at architecting software without actually writing it first.
There's no shame in discovering the software architecture as you go along. If you already knew the architecture beforehand, that means you're very familiar with the problem space already, and you probably should've written a framework to avoid repetitive work. In a sense, software development biases toward novel exploratory work rather than routine work, and that's why it's challenging.
> this all stems from a certain greed software developers have ... Meanwhile other engineering fields are far more humble.
Nonsense; greed is human. In all fields of engineering, the general principle is to do more with less. As the saying goes, any fool can build a bridge, but only an engineer can build a bridge that barely stands. All engineers want more features for less cost, and software is no different. The difference is that in most engineering, there are more physical constraints, more templates to apply, more repetitive work. And because of that, the norms are well-established in traditional engineering. You don't look at a big McMansion and call it "greedy" because it's just the norm. You don't look at a sprawling highway interchange with 4 levels of ramps and call it "greedy" because it's socially acceptable.
> Even your measly human body doesn't need weekly patches
Have you looked at the list of bugs for humans? Allergies, back pain, appendix, aging, various birth defects, cancer, etc. If anything, life is the ultimate example of spaghetti coding and monkey-patching. Look at how vertebrate embryos (human, chicken, fish) all look the same in the first few weeks of life, then they diverge as various body parts and limbs are grown or shrunken.
> Sadly, a good home-cooked meal is hard to find nowadays. Fast food is just too good to beat.
It's weird when people praise home-cooked meals, because I've found restaurants that have great food. Heck, I've been to various traditional sit-down restaurants where they bring you the food in one minute, which is faster than standing in line at McDonald's.
> Most software today just feels bloated and trashy, even if the experience is drug-like.
Most software does feel bloated and trashy to me too. I especially find that the more popular a piece of software is, the trashier it is. A few decades ago, for example, I found MSN Messenger popular but insufferable (big program size, laggy UI, lots of attention-grabbing features), whereas IRC was an underground community and the software was very well-behaved (small, not attention-grabbing).
Overall, I agree with the ideas that simpler software is better, that there's too much cargo-culting in the industry, and that piling on complexity and dependencies is bad. But the article wanders all over the place and doesn't hit the right points.
I couldn't disagree more with the premise of the sections on development methodology.
> Fast Food
> Here, we develop in "agile" sprints. Working software is developed at the fastest pace possible, and all bugs are to be fixed later.
> Home Cooked
> Here, things are slower, more thoughtful. More waterfall-y.
While "sprint" is a term that sounds like "fastest pace possible," that is not what it means; and a key point about waterfall vs. agile is that waterfall is NOT more thoughtful, just all planned up front.
Both methodologies can create bugs, deliver features faster than scale can be thought through, and deliver features faster than they can be tested.
If we remove the quotes from "agile", we actually get slower and more thoughtful. A key part of that is measuring (training, interviewing, analyzing). An agile process should build a feature, release that feature, interview users, analyze system behavior, and iterate: improving users' goals, adding appropriate scale, and removing unexpected errors or behavior.