Coding Isn't Programming
177 comments
· March 25, 2025
flohofwoe
Bah, that old thing again. FWIW, the demo scene 'coders' call themselves that with pride, and usually those are also brilliant 'programmers' and 'software engineers'.
Coder, programmer, software engineer call yourself whatever you want, the point is to make a computer do what you want it to do.
BinaryPie
You forgot "hacker" :D
jmull
IMO it's a pure waste of time to react to the headline without engaging the content.
raimondious
The headline implies the content is a waste of time to engage with.
nh23423fefe
No it doesn't. The headline is ambiguous and you inserted your own biases without confirming they align with the real world.
tombert
Normally I agree, but this is a Lamport talk. His stuff is pretty much always worth engaging with. I have learned more from his writings than nearly any other human in my life.
MattSayar
For those who think tl;dw, here's a PDF of the talk, linked on the page:
https://www.socallinuxexpo.org/sites/default/files/presentat...
After reading the whole thing I'm not sure how the title describes the presentation.
jsbg
> Coder, programmer, software engineer call yourself whatever you want, the point is to make a computer do what you want it to do.
That is definitely not what software engineering is. It requires translating requirements and ensuring that the code is maintainable in the context of a team and the product's lifecycle.
lxgr
Which is still, in the end, making the computer do what you want it to do – in a "do as I mean, not as I say" way.
zamalek
As a job, at least. Programming does not exist exclusively in that realm, hence the demo scene.
qwertox
At least in Germany you can only call yourself a software engineer if you've got an actual engineering degree.
> The professional title "engineer" is legally protected under the German Engineering Act [0].
[0] https://verwaltung.bund.de/leistungsverzeichnis/EN/leistung/...
seanhunter
Oliver Heaviside[1] was rejected when he attempted to join the Society of Telegraph Engineers because they said he was a clerk and not an engineer. Thing is, no one cares about them now, while his achievements live on.
People protecting titles by putting an arbitrary barrier associated with possessing a piece of paper rather than actually having skill and knowledge should be treated with scorn in my opinion.
[1] “Heaviside step function” and the “Coverup” method for partial fraction expansion when doing integrals are among his discoveries. https://en.wikipedia.org/wiki/Oliver_Heaviside
feoren
Requiring the sheet of paper is less about ensuring the person is qualified, and more about having something you can revoke if they act negligently. It turns out to be important some small percentage of the time that you can say "we will revoke your license if you aren't more careful in the future". And engineers can reject terrible suggestions from non-technical bosses by saying "I'd lose my license if I did that." That's its main value-add.
I've been to work social events where all the alcohol was provided by the company, but they still had to hire a bartender. You'd pick up a drink, hand it to the bartender, and they'd open it and give it back to you. It sure seems like a stupid, wasteful ceremony; that is, until someone is on their way to getting blackout drunk and the bartender can start refusing to open more drinks for them. They can even use the line "I could lose my bartending license if I keep serving you." The requirement for a licensed bartender was not to make sure someone knew how to open the drinks, it was to make sure someone had the authority and accountability for keeping things under control. (Not to mention making sure everyone was over 21.)
Requiring a stamp from a licensed professional seems pointless up until it prevents a big problem. I'm not opposed to requiring that a Licensed Software Engineer (or whatever) sign off on the software running on chemotherapy machines, as long as that licensing is all done in good faith.
qwertox
Maybe it's just about the ratio of scammers vs. honestly capable people.
You should also have mentioned that: "This riled Heaviside, who asked Thomson to sponsor him, and along with support of the society's president he was admitted 'despite the P.O. snobs'".
mlhpdx
> People protecting titles by putting an arbitrary barrier associated with possessing a piece of paper rather than actually having skill and knowledge should be treated with scorn in my opinion.
What’s your solution then? No attempt at providing professional standards at all?
Systems made by people will always be flawed. That is both the reason for and the criticism of certification and regulation.
elcritch
While I agree with the sentiments, there is a societal need for gatekeepers in some professions. The engineer title often comes with legal liabilities. It’s not necessarily just about talent. Of course it often becomes misused as well.
lenkite
> People protecting titles by putting an arbitrary barrier associated with possessing a piece of paper rather than actually having skill and knowledge should be treated with scorn in my opinion.
Would you say the same about a medical degree? How do you judge skill and knowledge in medicine without a professional association?
teleforce
I think we should stop glorifying Heaviside given his short-sighted view of quaternions for the EM field, which he considered an unnecessary "evil" [1]. He then developed a crippled version of them, the unintuitive vector calculus: a mathematical hack that some people still consider the gold standard for doing EM calculations.
IMHO, by doing this and campaigning against quaternions, he hindered progress in EM for more than a century, since his version is what textbooks teach, and most people, including EM engineers, don't care to look at what's available beyond the run-of-the-mill textbooks.
There's a famous saying attributed to Einstein, make things as simple as possible but not simpler, and in this case Heaviside's "Vectors Versus Quaternions" paper and related advocacy have done more harm than good to the math, science and engineering of EM-based systems by making things simpler than simple (pardon the pun).
I also have a hypothesis that perhaps someone could research: if Einstein had been properly exposed to quaternion-based EM, he might have arrived at the general theory of relativity [2] much earlier than the ten years it took him after the discovery of special relativity. A quaternionic form of relativity was presented by L. Silberstein as early as 1912, only a few years after the discovery of special relativity [3].
It is a shame that even today the Wikipedia entries for both special relativity and general relativity do not so much as mention quaternions (zero, nada, zilch), as if it were taboo to do so; perhaps that is partly thanks to Heaviside's seemingly very successful propaganda, even after more than a century of progress in math, science and engineering.
[1] Vectors Versus Quaternions (1893):
https://www.nature.com/articles/047533c0
[2] General relativity:
https://en.wikipedia.org/wiki/General_relativity
[3] Quaternionic Form of Relativity. L. Silberstein (1912) [PDF]:
https://dougsweetser.github.io/Q/Stuff/pdfs/Silberstein-Rela...
meigwilym
I understand your point, but having an engineering degree is not just the possession of a certificate. It's a piece of paper that testifies that the holder has the skill and knowledge, and has passed exams designed by experts in the field.
The response given to Heaviside does suggest that snobbery was a more likely reason for refusing his membership, but that's just my impression.
imtringued
Nobody cares in Germany, because nobody is calling themselves "Softwareingenieur". Everyone says "Softwareentwickler" aka software developer, or colloquially just "Entwickler"/developer.
As a fun bit of trivia: entwickeln literally means to unwrap or unwind, with ent- = un- and wickeln = to wind or wrap.
belter
The real problem is that coding is more like being a great musician than a traditional engineer.
It's like telling Eric Clapton, Jimi Hendrix, George Benson, or Wes Montgomery that they're not qualified to teach at a music conservatory because they lack formal diplomas. Sure, technically they're not "certified," but put them next to your average conservatory professor and they'll play circles around 99.99% of them.
Same goes for brilliant coders versus formal engineers.
WillAdams
Trying to think of the coding analogues to those folks.
The only ones I can think of are folks who are self-taught, e.g., Bill Gates. But he famously noted that he had read the three (then extant) volumes of Knuth's _The Art of Computer Programming_, sometimes taking an entire day to contemplate a single page/idea/exercise (and doing all the exercises), and went on to say that anyone who had read all three volumes (with the unspoken caveat of "also done all the exercises") should send him a résumé.
My best friend in high school was a self-taught guitarist, incredibly talented, he also had perfect pitch, and a wonderful insight into what a given piece of music was expressing.
All? of the original self-taught programmers were in an academic context, no? Even those who weren't have either read or researched a great deal, or they have laboriously re-created algorithms and coding concepts from first principles.
Maybe this is just another example of how society has not yet mastered the concept of education?
haswell
Building a bridge and building a sand castle are two entirely different things.
Each impressive in their own way, but clearly there is a difference both in terms of the skills required and the outcomes enabled.
I think the specific term used matters less than the ability to distinguish between different types of building things.
If you want to build a bridge, you’re not going to hire the sand castle builder if sand castles are all they have built.
9rx
In this case they all refer to building the same thing, but at different steps in the process. Theoretically each step could be carried out by a different person, but in modern practice it would be quite unusual to do so – which is why they can be, and are, used interchangeably. If someone is doing one of them it is almost certain they are doing all of them.
johnisgood
Yes, using a browser is different from writing a program, and there is a difference between "writing (or copy pasting) a script" and [...].
gwbas1c
I recently attended ICWMM, a conference of people who model stormwater for a living.
Many of the people there technically program for a living, because some of their models rely on Python scripts.
They are not software engineers. These aren't programs designed around security, scalability, or maintainability concerns, or even following processes like pull requests.
fuzzfactor
>people who model stormwater for a living.
>They are not software engineers. These aren't programs designed around security, scalability, or maintainability concerns, or even following processes like pull requests.
That's been the obvious problem for a while now.
These are the people whom software engineers need to be cheerfully working for.
Any less-effective hierarchy and they'll just have to continue making progress on their own terms.
gwbas1c
> These are the people whom software engineers need to be cheerfully working for.
Nope: We are equals with very different goals.
I (a software engineer) happen to work on a product, with customers. Modeling stormwater is part of the product. Within my company, the people who do modeling are roughly equal to me.
In the past, I've built products for customers where the products do not provide programming / scripting tools.
> That's been the obvious problem for a while now.
A script that someone runs on their desktop to achieve a narrowly-defined task, such as running a model, does not need to handle the same concerns that I, as a software engineer, need to handle. To be quite honest, it could be awful spaghetti code, but if no one else will ever touch it once the model is complete, there is no need for things like code reviews, etc.
MITSardine
This is very reductive of scientific computing. Maybe that conference is particularly narrowly focused on certain applications, but there are plenty of people developing "real" programs (not Python scripts) in academia/research labs, and you usually see those at conferences.
Look for instance at multi-purpose solvers like FreeFEM or FEniCS, or the distributed parallelism linalg libraries like PETSc or MUMPS, or more recent projects like OCCA. These are not small projects, and they care a lot about scalability and maintainability, as well as some of them being pretty "CS-y".
marpstar
This was something I didn't understand when choosing between "Computer Science" and "Software Engineering" as a major in college.
At its simplest:
"Engineering Software" is what people who build software products do.
"Science with a Computer" is something that people doing real calculations, simulations, etc do.
I, too, forget that the tools we use to build complex solutions can still be used by people who don't live in a text editor.
crazygringo
Computer science isn't "science with a computer". It's not about calculations and simulations.
Computer science is the science of computation. It's more to do with theory of computation, information theory, data structures, algorithms, cryptography, databases.
"Science with a computer" is experimental physics, chemisty, biology, etc.
hobofan
No!
"Computer Science" != "Science with a Computer"
"Computer science" (in contrast to software engineering) is the science that looks into the properties of computers and thing related to it (algorithms, database systems). E.g. examining those systems on a scientific level. Depending on how abstract that can have more or less direct input on how to engineer systems in their software/hardware implementations.
"Science with a computer" on the other hand are usually (in terms of majors/academic disciplines) taught in the adjecent subfields: Bioinformatics, Chemoinformatics, Computational biology, etc.. Of course basic computing is also on some level seeping into every science subfield.
MITSardine
You might have meant "Scientific computing" or "computational science" rather than "computer science". Alas, everyone will now point out your mistake.
coro_1
I laughed at this. Having a lovely fixed and unmovable perspective on many pursuits is so engineering. Coding, programming, etc, shapes our communications skills and our outlook in interesting ways. Dating truly benefits as well.
anonzzzies
I can already see next year's keynote: 'vibing isn't coding isn't programming isn't ...; a pyramid of shite that sometimes works a bit'. I am happy Dijkstra doesn't have to see this; he was angry in the '80s in my parents' living room; I cannot even imagine what he would be like with 'vibe coding'.
frou_dh
"Vibe coding" seems like one of those things where there's way more people complaining about the concept than there ever were people actually promoting it in the first place.
anonzzzies
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
ziddoap
Yes, there are results for people saying "vibe coding" on HN. I don't feel like reading the ~500 or so mentions of it to tally which ones are people promoting it vs. which are people complaining about it. Do you have a summary of your results?
The first page seems mostly to be people complaining.
Cthulhu_
It's going to be rage bait at this point.
api
> ... seems like one of those things where there's way more people complaining ... than there ever were people actually promoting it in the first place.
The entire social and political discourse since around 2016.
alabastervlog
> he was angry in the '80s in my parents' living room
Yikes, how bad was your parents’ living room?
anonzzzies
My father was a commercial guy (and a good programmer) who wanted to sell development services to companies. Dijkstra didn't really like this idea, especially if it was done in a way where badly educated (...) people would write code that was not proven correct, which would then be distributed to the clients, making the world worse.
eikenberry
> wanted to sell development services to companies
So he wanted to start a contracting service? Contractors are famous for writing piles of untested spaghetti code that only kind of works, but by then they've finished and the contract is over. Then some poor sap gets stuck trying to maintain it. Probably one of the worst ways businesses acquire custom software.
jonnycat
Looking back, they were both right!
desdenova
That's exactly what ended up happening.
tmpz22
Vibe coding is analogous to programming while high with identical results.
latexr
At least if you’re programming high (or drunk) you get a different kind of fun and the next day you know you have to recheck your work. The one thing that bothers me about LLM coding¹ is that the vast majority of people doing it don’t understand the issues. Those who do check their work and believe everyone else also does are immensely naive and will one day be severely bitten² by bad unreviewed code written by an LLM.
¹ Not calling it “vibe coding” to not go into pedantic definition discussions.
² As will we all.
baq
Vape coding, maybe. Vibe coding react interfaces for backoffice tools is very real, speaking first hand.
anonzzzies
Is it, though? What does the term mean? Are we talking to our computers, not reading anything, and just judging the end result as non-coders? Or are you just using it to help with coding? Vibe coding would be the former, and while it sometimes works, it doesn't in many cases, including LoB tools if the back office is large and complex enough. Unless you have some interesting tooling to help out, in which case it is possible, but that's not really vibe coding at the moment.
ge96
Funny, while getting high used to make me creative (drawing), I could not do anything hard like writing logic. I also used to think my dumb ideas were good, so yeah, I don't do it anymore.
smusamashah
What exactly was he angry at?
anonzzzies
People writing software without proper foundations of logic and proofs. Which was by then quite 'common' (now it's normal of course).
miroljub
> a pyramid of shite
Why do some people use abominations like "shite" or "shoot" instead of just using "shit"? C'mon lads, everyone knows what you want to say, why not just say it?
JimDabell
“Shite” isn’t a minced oath. It’s a different word to “shit” with a different etymology. Same as “feck” is a different word to “fuck”, even if they are used in the same way.
umanwizard
That’s not true at least according to etymonline. Got a source?
ChrisRR
Shite is a perfectly normal use in the UK, especially in Scotland. It has a slightly different effect to shit
bena
I'd imagine he's British. "Shite" seems to be the spelling favored by the British. Much like arse/ass
voidUpdate
We use both, in different circumstances
osigurdson
People can use whatever word they want.
dpassens
Shite isn't a euphemism like shoot, it's a synonym of shit.
anonzzzies
Some people don't live in the US, woooow... In the UK, and especially certain parts of it, shite is the actual word to use. Shite is definitely not the same as gosh or shoot. It's as vulgar as shit (or more), but our own.
I personally also don't like gosh or shoot, since we all know what you mean, but this is not such a case; look it up.
neuroelectron
Leslie Lamport's SCaLE 22x Keynote: Think, Abstract, Then Code
Lamport argued for a fundamental shift in programming approach, emphasizing thinking and abstraction before coding, applicable to all non-trivial code, not just concurrent systems.
Abstraction First: Define an abstract view (the "what" and "how") of your program before writing code. This high-level design clarifies logic and catches errors early. Focus on ideas, not specific languages.
Algorithms != Programs: Algorithms are abstract concepts; programs are concrete implementations.
Executions as States: Model program executions as sequences of states, where each state's future depends only on its present. This simplifies reasoning, especially for concurrency.
Invariants are Crucial: Identify invariants, properties true for all states of all executions. Understanding these is essential for correctness (a short sketch follows this summary).
Precise Specifications Matter: Many library programs lack clear specifications, hindering correct usage, particularly with concurrency. Precise, language-independent descriptions of functionality are necessary.
Writing is Thinking: Writing forces clarity and exposes sloppy thinking. Thinking improves writing. It's a virtuous cycle.
Learn to Abstract: Abstraction is a core skill, central to mathematics. Programmers need to develop this ability.
AI for Abstraction? The question was raised whether or not AI models could be used for the abstract-thinking part of programming.
The core message: Programming should be a deliberate process of thoughtful design (abstraction) followed by implementation (coding), with a strong emphasis on clear specifications and understanding program behavior through state sequences and invariants. Thinking is always better than not thinking.
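A minimal sketch of the state-sequence and invariant ideas in Python; Lamport would use TLA+ rather than code, and everything here (a toy counter) is illustrative only:

    # A state is a dict of variable values; an execution is a sequence of
    # states in which each state is determined solely by the one before it.
    def next_state(state):
        if state["count"] < state["limit"]:
            return {"count": state["count"] + 1, "limit": state["limit"]}
        return state  # terminated: the state repeats ("stutters")

    def execution(initial, steps):
        states = [initial]
        for _ in range(steps):
            states.append(next_state(states[-1]))
        return states

    def invariant(state):
        # A property that should hold in every state of every execution.
        return 0 <= state["count"] <= state["limit"]

    states = execution({"count": 0, "limit": 3}, steps=5)
    assert all(invariant(s) for s in states)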
sevensor
With all due respect to Dr. Lamport, whose work I greatly admire, I take issue with his example of the max function. The problem he sets up is “what to do with an empty sequence.” He proceeds to handwave the integers and suggest that negative infinity is the right value. But in realistic use cases for max, I want to assume that it produces a finite number, and if the inputs are integral, I expect an integer to pop out.
Apologies to Lamport if he addresses this later; this is where I quit watching.
There are basically two ways out of this that I’m aware of: either return an error, or make the initial value of the reduction explicit in the function signature, i.e.:
max(x, xs)
And not max(xs)
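Both ways out are easy to make concrete; a hypothetical Python sketch (function names invented, not from the talk):

    from functools import reduce

    # Way 1: signal an error on the empty sequence.
    def max_or_error(xs):
        if not xs:
            raise ValueError("max of an empty sequence")
        return reduce(lambda a, b: a if a >= b else b, xs)

    # Way 2: make the initial value of the reduction explicit: max(x, xs).
    def max_with_seed(x, xs):
        return reduce(lambda a, b: a if a >= b else b, xs, x)

    assert max_or_error([3, 1, 2]) == 3
    assert max_with_seed(0, []) == 0  # the empty case is now well defined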
ndriscoll
I had the same thought skimming the PDF, but he does address it. One of his suggestions is to use an error value as the point at infinity. So I suppose the point is the coding looks the same, but you're taking a more principled view of what you're doing for reasoning purposes.
pron
> But in realistic use cases for max, I want to assume that it produces a finite number, and if the inputs are integral, I expect an integer to pop out.
But that's his point. You're talking about a concrete concern about code, whereas he says that first you should think in abstractions. Negative infinity helps you, in this case, to think in abstractions first.
tasuki
The answer is a NonEmptySequence type. The `max` function doesn't make sense in the context of an empty sequence.
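A rough sketch of such a type in Python, which lacks the sum types that make this idiomatic elsewhere; the names are invented for illustration:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class NonEmptySeq:
        head: int
        tail: tuple = ()  # possibly empty

        def max(self):
            # Always defined: there is at least one element by construction.
            return max((self.head, *self.tail))

    assert NonEmptySeq(3, (1, 2)).max() == 3
    # An empty NonEmptySeq simply cannot be constructed, so max never fails.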
MrMcCall
> Thinking is always better than not thinking.
"Truer words were never spoken." --Case from Gibson's Neuromancer
neuroelectron
Key points:
Algorithm vs. Program: An algorithm is a high-level, abstract concept implementable in various programming languages. We should focus on the ideas, not specific languages.
Concurrency Challenges: Concurrent programs are hard due to thread interleaving, leading to numerous possible executions. Debugging is unreliable, and changes can expose hidden bugs.
Abstraction is Key: Find an abstract view of the program, especially for concurrency, describing how threads synchronize. This higher-level thinking is crucial before coding.
All programs: any piece of code that requires thinking before you write code.
What and How: For most programs, write both "what" the program does and "how" it does it. Precise languages (like Lamport's TLA+) are vital for verifying concurrent programs but not essential for all.
Trivial Example: A simple "max element in array" function illustrates how abstraction simplifies even trivial problems. It revealed a bug in the initial "what" (handling empty arrays) and led to a more robust solution using mathematical concepts (minus infinity); a sketch follows this summary.
Executions as State Sequences: View program executions as sequences of states, where the next state depends solely on the current one. This simplifies understanding.
Invariants: An invariant is a condition true for all states of all executions. Understanding invariants is crucial for proving program correctness.
Termination: The example program doesn't always terminate, showcasing the need to consider termination conditions.
Real-World TLA+ Usage: Amazon Web Services uses TLA+ to find design flaws before coding, saving time and money. The Rosetta mission's operating system, Virtuoso, used TLA+ for a cleaner architecture and 10x code reduction.
Abstraction Beyond Concurrency: Even when the "what" is imprecise (like a pretty-printer), abstraction helps by providing high-level rules, making debugging easier.
Thinking and Writing: Thinking requires writing. Writing helps you think better, and vice versa.
Learn Abstraction: Abstraction is central to math. Programmers should learn to think abstractly, possibly with guidance from mathematicians.
Why Programs Have Bugs: Library programs often lack precise descriptions, making them hard to use correctly. This is especially true for concurrent programs, where input/output relations aren't sufficient.
Lamport emphasizes that programming should be about thoughtful design through abstraction before coding. He advocates viewing executions as state sequences, understanding invariants, and using writing to clarify thinking. He highlights the importance of precise specifications, especially for library programs. Finally, he touches on the issue of whether or not it is better to outsource abstract thinking to AI.
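The promised sketch of the minus-infinity version summarized above, using Python's float infinity to stand in for the extended integers; the function name is invented:

    import math

    def max_elem(xs):
        best = -math.inf  # the max over the empty array, by definition
        for x in xs:
            if x > best:
                best = x
        return best

    assert max_elem([]) == -math.inf  # the empty array is no longer a special case
    assert max_elem([3, 1, 2]) == 3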
umanwizard
Please just stop posting LLM content to HN. If I want to know what an LLM has to say I can ask it myself.
fwip
Could we get a policy that just pasted LLM output like this should be labeled?
rubiquity
I'm enjoying the irony of the comments section being primarily occupied by people who don't get the message while simultaneously being AI maximalists. Leslie Lamport's entire point is that developing abstract reasoning skills leads to better programs. Abstraction in the math and logical sense lets you do away with the all the irrelevant tiny details, just as AI-guided software development is supposed to do.
What's sad, but to be expected with anything involving rigor, is how many people only read the title and developed an adverse reaction. The Hacker in Hacker News can be a skilled programmer who can find their way around things. Maybe now it's more of the hack as in "you're a hack", meaning unskilled and producing low-quality results.
kccqzy
An undergraduate mathematics professor of mine liked to use the word coding to refer to any act of transforming concepts into a more precise and machine readable form. Not just writing what you want a computer to do in a programming language, but also encoding data. The word "encode" should make this clear: we are turning something into code. Right afterwards, he drew some binary trees on a blackboard and asked us to come up with a coding scheme to turn these differently shaped binary trees into natural numbers. That means defining an injective function from the set of binary trees to the set of natural numbers.
Coding is too ambiguous a word and I don't really use it much.
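One possible such encoding, sketched in Python; this particular scheme (via the Cantor pairing function) is an illustration, not necessarily the one the professor had in mind:

    def pair(a, b):
        # Cantor pairing: a bijection from pairs of naturals to naturals.
        return (a + b) * (a + b + 1) // 2 + b

    def encode(tree):
        # A tree shape is None (empty) or a (left, right) pair of tree shapes.
        if tree is None:
            return 0
        left, right = tree
        return 1 + pair(encode(left), encode(right))

    assert encode(None) == 0
    assert encode((None, None)) == 1
    # Differently shaped trees get different numbers:
    assert encode(((None, None), None)) != encode((None, (None, None)))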
WillAdams
Interesting takeaway (paraphrased):
>Algorithms are not programs and should not be written in programming languages and can/should be simple, while programs have to be complex due to the need to execute quickly on potentially large datasets.
Specifically discussed in the context of concurrent programs executing on multiple CPUs due to order of execution differing.
Defining programs as:
>Any piece of code that requires thinking before coding
>Any piece of code to be used by someone who doesn't want to read the code
Apparently he has been giving this talk for a while:
https://www.youtube.com/watch?v=uyLy7Fu4FB4
(previous discussions of it here?)
Interesting that the solid example, recasting "find the largest item in a set" as "find the smallest value greater than or equal to every element" and starting the search with the value negative infinity, was exactly "Define Errors Out of Existence" from John Ousterhout's book:
https://www.goodreads.com/book/show/39996759-a-philosophy-of...
as discussed here previously: https://news.ycombinator.com/item?id=27686818
(but of course, as an old (La)TeX guy, I'm favourably disposed towards anything Leslie (La)mport brings up)
MrMcCall
> starting the search with the value negative infinity
But then there is no differentiation between the result of the empty set and a set containing only negative infinities.
I consider them separate conditions and would therefore make them separate return values.
That's why I would prefer returning a tuple of a bool hasMax and an int maxV, where maxV's int value is only meaningful if hasMax is true, meaning that the set is non-empty.
Another way of doing this would be passing in an int defMax that is returned for the empty set, but that runs into the same problem should the set's actual max value be that defMax.
Anyway, I typed it up differently in my other toplevel comment in this thread.
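A sketch of that (hasMax, maxV) convention in Python, using floats so that a set containing only negative infinities stays distinguishable from the empty set; names are illustrative:

    import math

    def try_max(xs):
        # max_v is only meaningful when has_max is True.
        has_max, max_v = False, 0
        for x in xs:
            if not has_max or x > max_v:
                has_max, max_v = True, x
        return has_max, max_v

    assert try_max([]) == (False, 0)
    assert try_max([-math.inf]) == (True, -math.inf)  # not confused with the empty set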
ndriscoll
Abstractly dealing with a problem where there's already a negative infinity is no problem. e.g. you can just add another negative infinity that's even smaller. That negative infinity can also be your error value; concretely you might do that by writing a generic algorithm that operates on any Ordered type, and then provide an Ordering instance on Result[E,A] where an E is less than all A. `A` could be some extended number type, or another `Result[E,A']`, or whatever thing with its own `Ordering`.
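A sketch of the fresh-bottom-element idea in Python; the comment above is language-agnostic, so the class and function names here are invented:

    from functools import total_ordering

    @total_ordering
    class Bottom:
        # An error value ordered below every other value, even float("-inf").
        def __eq__(self, other):
            return isinstance(other, Bottom)
        def __lt__(self, other):
            return not isinstance(other, Bottom)

    def max_elem(xs):
        best = Bottom()
        for x in xs:
            if best < x:
                best = x
        return best  # a Bottom result doubles as the "empty input" error value

    assert isinstance(max_elem([]), Bottom)
    assert max_elem([float("-inf"), 1]) == 1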
cloogshicer
Lamport argues that we should separate the "What?" from the "How?". I wonder, though: for most problems, don't the "What" and the "How" of a program somewhat merge?
For example, are performance considerations part of the "What", or of the "How"?
osigurdson
>> separate the "What?" from the "How?"
This is a huge mental error that people make over and over again. There is no absolute "what" and "how", they are just different levels of abstraction - with important inflection points along the way.
The highest level of abstraction is to simply increase happiness. Usually this is accomplished by trying to increase NPV. Let's do that by making and selling an accounting app. This continues to get more and more concrete down to decisions like "should I use a for loop or a while loop"
Clearly from the above example, "increase NPV" is not an important inflection point as it is undifferentiated and not concrete enough to make any difference. Similarly "for loop vs while loop" is likely not an inflection point either. Rather, there are important decisions along the way that make a given objective a success. Sometimes details can make a massive difference.
glass_of_water
I could not agree more. When trying to understand the distinction between "declarative" and "non-declarative" code, I came to the same conclusion: it's all relative.
molf
I think he argues that more thought should go into explicitly designing the behaviour of a software system ("What?"), independently from its implementation ("How?"). Explicitly designed program behaviour is an abstraction (an algorithm or a set of rules) separate from its implementation (code written in a specific language).
Even if a user cannot precisely explain what a program needs to do, programmers should still explicitly design its behaviour. The benefit, he argues, is that having this explicit design enables you to verify whether the implementation is actually correct. Real-world programs without such designs, he jokes, are by definition bug-free, because without a design you can't determine if certain behaviour is intentional or a bug.
Although I have no experience with TLA+ (which he designed for this purpose in the context of concurrency), this advice does ring true. I have mentored countless programmers, and I've observed that many (junior) programmers see program behaviour as an unintentional consequence of their code, rather than as deliberate choices made before or during programming. They often do not worry about "corner cases", accepting whichever behaviour emerges from their implementation.
Lamport says: no, all behaviour must be intentional. Furthermore, he argues that if you properly design the intended program behaviour, your implementation becomes much simpler. I fully agree!
emporas
Separating the "What" from the "How" is a pretty old idea originating from Prolog, and yes you can do that. Erlang for example, in which many of Lamport's ideas are implemented, is a variation of Prolog.
Prolog by itself hasn't found many real-world applications to date, because by ignoring the "How", performance suffers a lot, even though the algorithm might be correct.
That's the reason algorithms are implemented procedurally, and then some invariants of the program are proved on top of the procedural algorithm using a tool like TLA+.
But in practice we implement procedural algorithms and we don't prove any property on top of them. It is not like the average programmer writes code for spaceships. The spaceship market is not significant enough to warrant that much additional effort.
stonemetal12
Isn't that more or less what all abstraction is about: getting an interface that allows you to work with the "what" but not the "how"? When I call file open, I get the "what", a file to work with, but not the "how": DMA access, spinning rust, SSDs; it could be NFS in there, who knows.
Yes, the "hows" have impacts on performance. When implementing or selecting the "how" those performance criteria flow down into "how" requirements. As far as I am concerned that is no difference from correctness requirements placed on the "how".
Coming from the other direction. If I am hard disk manufacturer I don't care about the "what". I only care about the "how" of getting storage and the disk IO interface implemented so that my disk is usable by the file system abstraction that the "what" cares about. I may not know the exact performance criteria for your "what", but more performance equals more "whats" satisfied and willing to use my how.
MrMcCall
"not 2, not 1" is the old Zen abstraction for deeply interrelated, but separate concepts.
The What puts constraints on the How, while the How puts constraints on the What.
I would say that the What is the spatial definition of the algorithm's potential data set, while the How is the temporal definition of the algorithm's potential execution path.
rpmisms
Not to be meta, but this talk and this discussion are insanely pedantic.
cratermoon
As I look around at the state of software development in the tech industry, I see a desperate need for more pedantry, not less. As a whole the industry seems bent on selling garbage that barely works, then gradually making it worse in pursuit of growth-at-all-costs.
When nobody cares if their systems work in any meaningful way, of course they'll dismiss anything that hints at rigor and professionalism as pedantry.
rpmisms
I'm not saying pedantry is bad per se, but I think it can be taken too far, which is happening here.
neuroelectron
Yes, I had to pump it through Gemini three times before I could really get what he was talking about. He spends the first three and a half minutes just starting to frame the subject, which I gleaned from reading the transcript. I wonder if this is just his academic background or a long career of billing by the hour. Probably both.
MrMcCall
Maybe my never using LLMs means I can still understand a talk, even one as poorly recorded and delivered as this one was.
> Probably both.
Maybe neither. Maybe PEBCAK.
taeric
There is a good article in the current ACM about how we don't agree on what abstractions are, and how, in spite of that, they are very useful. Point being that we largely agree on where the important points are. We don't really agree on exactly what they are, or why they are important.
In that vein, I think people will often talk about the same general topics with massive agreement that they are the important ones. All the while not actually agreeing on any real points.
You will find lots of inspiration. Which, I think, has its own value.
charliereese
My favourite difference is that coding has the letters c and d (which in my opinion are inferior) whereas programming has p, r, g, a, and m (which in my opinion are underrated). I dislike that they both have o, i, n, and g. I wish this distinction was touched on as well.
rustybolt
Hacking isn't coding isn't programming isn't software development isn't software engineering. But in the end many people use these terms mostly interchangeably and making a point of the differences between the definitions you personally use is rarely a productive use of time.
indymike
I'm pretty sure I can't program without coding, and likewise, I'm pretty sure I can't code without programming. This is a lot like guitarists arguing about "playing" vs "noodling".
wavemode
I think the distinction is evident in the words themselves. Coding creates code, programming creates programs.
If you write code but it doesn't create a program (i.e. you are just recording data) then you're coding but not programming.
Likewise, if you create a program without writing any code (for example, the way a pilot programs routes into his nav computer, or the way you can program a modern thermostat to have specific temperatures at specific times of the day) you're programming but not coding.