Coding Isn't Programming
215 comments
· March 25, 2025
flohofwoe
Bah that old thing again. FWIW the demo scene 'coders' call themselves that with pride, and they are often also brilliant 'programmers' and 'software engineers'.
Coder, programmer, software engineer call yourself whatever you want, the point is to make a computer do what you want it to do.
qwertox
At least in Germany you can only call yourself a software engineer if you've got an actual engineering degree.
> The professional title "engineer" is legally protected under the German Engineering Act [0].
[0] https://verwaltung.bund.de/leistungsverzeichnis/EN/leistung/...
seanhunter
Oliver Heaviside [1] was rejected when he attempted to join the Society of Telegraph Engineers because they said he was a clerk and not an engineer. Thing is, no one cares about them now, and his achievements live on.
People protecting titles by putting an arbitrary barrier associated with possessing a piece of paper rather than actually having skill and knowledge should be treated with scorn in my opinion.
[1] “Heaviside step function” and the “Coverup” method for partial fraction expansion when doing integrals are among his discoveries. https://en.wikipedia.org/wiki/Oliver_Heaviside
feoren
Requiring the sheet of paper is less about ensuring the person is qualified, and more about having something you can revoke if they act negligently. It turns out to be important some small percentage of the time that you can say "we will revoke your license if you aren't more careful in the future". And engineers can reject terrible suggestions from non-technical bosses by saying "I'd lose my license if I did that." That's its main value-add.
I've been to work social events where all the alcohol was provided by the company, but they still had to hire a bartender. You'd pick up a drink, hand it to the bartender, and they'd open it and give it back to you. It sure seems like a stupid, wasteful ceremony; that is, until someone is on their way to getting blackout drunk and the bartender can start refusing to open more drinks for them. They can even use the line "I could lose my bartending license if I keep serving you." The requirement for a licensed bartender was not to make sure someone knew how to open the drinks, it was to make sure someone had the authority and accountability for keeping things under control. (Not to mention making sure everyone was over 21.)
Requiring a stamp from a licensed professional seems pointless up until it prevents a big problem. I'm not opposed to requiring that a Licensed Software Engineer (or whatever) sign off on the software running on chemotherapy machines, as long as that licensing is all done in good faith.
qwertox
Maybe it's just about the ratio of scammers vs. honestly capable people.
You should also have mentioned that: "This riled Heaviside, who asked Thomson to sponsor him, and along with support of the society's president he was admitted 'despite the P.O. snobs'".
mlhpdx
> People protecting titles by putting an arbitrary barrier associated with possessing a piece of paper rather than actually having skill and knowledge should be treated with scorn in my opinion.
What’s your solution then? No attempt at providing professional standards at all?
Systems made by people will always be flawed. That is the reason for and criticism of certification and regulation.
elcritch
While I agree with the sentiments, there is a societal need for gatekeepers in some professions. The engineer title often comes with legal liabilities. It’s not necessarily just about talent. Of course it often becomes misused as well.
teleforce
I think we should stop glorifying Heaviside for his short-sighted view of the use of quaternions for the EM field, which he considered an unnecessary "evil" [1]. He then developed a crippled version of them, the unintuitive vector calculus, a mathematical hack that some people still consider the gold standard for doing EM calculations.
IMHO, by doing this and campaigning against quaternions he hindered progress in EM for more than a century, since that's what is taught in textbooks, and most people, including EM engineers, don't care to look at what's available beyond the run-of-the-mill textbooks.
There's a very famous saying attributed to Einstein, make it as simple as possible but not simpler, and in this case Heaviside's "Vectors Versus Quaternions" paper and related advocacy have caused more harm than good to the math, science and engineering of EM-based systems by going simpler than simple (pardon the pun).
I also have a hypothesis that perhaps someone can research: had Einstein been properly exposed to quaternion-based EM, he might have solved the general theory of relativity much earlier than the ten years it took him after the discovery of special relativity. A quaternionic form of relativity was presented by L. Silberstein as early as 1912, just seven years after the discovery of special relativity [3].
It is a shame that even today the Wikipedia entries for both special relativity and general relativity [2] do not even mention quaternions (zero, nada, zilch), as if it were taboo to do so, perhaps partly thanks to Heaviside's seemingly very successful propaganda, even after more than a century of progress in math, science and engineering.
[1] Vectors Versus Quaternions (1893):
https://www.nature.com/articles/047533c0
[2] General relativity:
https://en.wikipedia.org/wiki/General_relativity
[3] Quaternionic Form of Relativity. L. Silberstein (1912) [PDF]:
https://dougsweetser.github.io/Q/Stuff/pdfs/Silberstein-Rela...
lenkite
> People protecting titles by putting an arbitrary barrier associated with possessing a piece of paper rather than actually having skill and knowledge should be treated with scorn in my opinion.
Would you say the same about a medical degree? How do you judge skill and knowledge in medicine without a professional association?
meigwilym
I understand your point, but having an engineering degree is not just the possession of a certificate. It's a piece of paper that testifies that the holder has the skill and knowledge, and has passed exams designed by experts in the field.
The response given to Heaviside does suggest that snobbery was a more likely reason for refusing his membership, but that's just my impression.
imtringued
Nobody cares in Germany, because nobody is calling themselves "Softwareingenieur". Everyone says "Softwareentwickler" aka software developer, or colloquially just "Entwickler"/developer.
As a fun bit of trivia: entwickeln also means to unravel, with ent=un and wickeln = to ravel.
belter
The real problem is that coding is more like being a great musician than a traditional engineer.
It's like telling Eric Clapton, Jimi Hendrix, George Benson, or Wes Montgomery that they're not qualified to teach at a music conservatory because they lack formal diplomas. Sure, technically they're not "certified," but put them next to your average conservatory professor and they'll play circles around 99.99% of them.
Same goes for brilliant coders versus formal engineers.
WillAdams
Trying to think of the coding analogues to those folks.
The only ones I can think of are folks who are self-taught, e.g., Bill Gates. But he famously noted that he had read the three (then extant) volumes of Knuth's _The Art of Computer Programming_, sometimes taking an entire day to contemplate a single page/idea/exercise (and doing all the exercises), and went on to say that anyone who had read all three volumes (with the unspoken caveat of "and also done all the exercises") should send him a résumé.
My best friend in high school was a self-taught guitarist, incredibly talented, he also had perfect pitch, and a wonderful insight into what a given piece of music was expressing.
All? of the original self-taught programmers were in an academic context, no? Even those who aren't have either read or researched a great deal, or they have laboriously re-created algorithms and coding concepts from first principles.
Maybe this is just another example of how society has not yet mastered the concept of education?
ramonverse
I didn't study Engineering. I work in Germany, and all my software positions literally said "Engineer" in the contract (in both the German and English versions). Maybe if it's in English it's not illegal, who knows.
wink
IANAL, but I actually think the English word "Engineer" is not protected in Germany the way "Ingenieur" is, which is why companies use it, or "Software Engineer".
You also don't always get a say what they put in your sig or on your business card (and I think internal names would not matter anyway).
jmull
IMO it's a pure waste of time to react to the headline without engaging the content.
raimondious
The headline implies the content is a waste of time to engage with.
tombert
Normally I agree, but this is a Lamport talk. His stuff is pretty much always worth engaging with. I have learned more from his writings than nearly any other human in my life.
nh23423fefe
No it doesn't. The headline is ambiguous and you inserted your own biases without confirming they align with the real world.
MattSayar
For those who think tl;dw, here's a PDF of the talk linked on the page:
https://www.socallinuxexpo.org/sites/default/files/presentat...
After reading the whole thing I'm not sure how the title describes the presentation.
forgotpwd16
Presentation showcases how coding is just a part of programming, and at a point states it explicitly:
>Programming should be thinking followed by coding.
Maybe if the nouns were switched, i.e. "Programming Isn't Coding", it would've been better.
anotherevan
450 page (slide) PDF. We need a tl;dr on the tl;dw.
jsbg
> Coder, programmer, software engineer call yourself whatever you want, the point is to make a computer do what you want it to do.
That is definitely not what software engineering is. It requires translating requirements and ensuring that the code is maintainable in the context of a team and the product's lifecycle.
lxgr
Which is still, in the end, making the computer do what you want it to do – in a "do as I mean, not as I say" way.
bitpush
I understand where you're coming from, but a hobby "coder" making a computer do little things is also 'translating requirements' and doing all the things a software engineer/coder/programmer does.
If you still disagree, point me to a place where I can find some coders.
zamalek
As a job, at least. Programming does not exist exclusively in that realm, hence the demo scene.
haswell
Building a bridge and building a sand castle are two entirely different things.
Each impressive in their own way, but clearly there is a difference both in terms of the skills required and the outcomes enabled.
I think the specific term used matters less than the ability to distinguish between different types of building things.
If you want to build a bridge, you’re not going to hire the sand castle builder if sand castles are all they have built.
9rx
In this case they all refer to building the same thing, but at different steps in the process. Theoretically each step could be carried out by a different person, but in modern practice it would be quite unusual to do so – which is why they can be, and are, used interchangeably. If someone is doing one of them it is almost certain they are doing all of them.
johnisgood
Yes, using a browser is different from writing a program, and there is a difference between "writing (or copy pasting) a script" and [...].
gwbas1c
I recently attended ICWMM, a conference of people who model stormwater for a living.
Many of the people there technically program for a living, because some of their models rely on Python scripts.
They are not software engineers. These aren't programs designed around security issues, or scalability issues, maintainability issues, or even following processes like pull requests.
fuzzfactor
>people who model stormwater for a living.
>They are not software engineers. These aren't programs designed around security issues, or scalability issues, maintainability issues, or even following processes like pull requests.
That's been the obvious problem for a while now.
These are the people whom software engineers need to be cheerfully working for.
Any less-effective hierarchy and they'll just have to continue making progress on their own terms.
gwbas1c
> These are the people whom software engineers need to be cheerfully working for.
Nope: We are equals with very different goals.
I (a software engineer) happen to work on a product, with customers. Modeling stormwater is part of the product. Within my company, the people who do modeling are roughly equal to me.
In the past, I've built products for customers where the products do not provide programming / scripting tools.
> That's been the obvious problem for a while now.
A script that someone runs on their desktop to achieve a narrowly-defined task, such as running a model, does not need to handle the same concerns that I, as a software engineer, need to handle. To be quite honest, it could be awful spaghetti code, but if no one else will ever touch it once the model is complete, there is no need for things like code reviews, etc.
MITSardine
This is very reductive of scientific computing. Maybe that conference is particularly narrowly focused on certain applications, but there are plenty of people developing "real" programs (not Python scripts) in academia/research labs, and you usually see those at conferences.
Look for instance at multi-purpose solvers like FreeFEM or FEniCS, or the distributed parallelism linalg libraries like PETSc or MUMPS, or more recent projects like OCCA. These are not small projects, and they care a lot about scalability and maintainability, as well as some of them being pretty "CS-y".
marpstar
This was something I didn't understand when choosing between "Computer Science" and "Software Engineering" as a major in college.
At its simplest:
"Engineering Software" is what people who build software products do.
"Science with a Computer" is something that people doing real calculations, simulations, etc do.
I, too, forget that the tools we use to build complex solutions can still be used by people who don't live in a text editor.
crazygringo
Computer science isn't "science with a computer". It's not about calculations and simulations.
Computer science is the science of computation. It's more to do with theory of computation, information theory, data structures, algorithms, cryptography, databases.
"Science with a computer" is experimental physics, chemisty, biology, etc.
hobofan
No!
"Computer Science" != "Science with a Computer"
"Computer science" (in contrast to software engineering) is the science that looks into the properties of computers and thing related to it (algorithms, database systems). E.g. examining those systems on a scientific level. Depending on how abstract that can have more or less direct input on how to engineer systems in their software/hardware implementations.
"Science with a computer" on the other hand are usually (in terms of majors/academic disciplines) taught in the adjecent subfields: Bioinformatics, Chemoinformatics, Computational biology, etc.. Of course basic computing is also on some level seeping into every science subfield.
MITSardine
You might have meant "Scientific computing" or "computational science" rather than "computer science". Alas, everyone will now point out your mistake.
coro_1
I laughed at this. Having a lovely fixed and unmovable perspective on many pursuits is so engineering. Coding, programming, etc, shapes our communications skills and our outlook in interesting ways. Dating truly benefits as well.
pron
You clearly have not watched the talk. His point isn't that "coders" aren't "programmers", but that coding — i.e. the act of writing code, i.e. text in a programming language, i.e. text that can be interpreted and run by a computer — isn't the same as programming, which is the act of efficiently producing software that does what's expected of it. He attempts to show why only writing code is not the best way to produce software that does what's expected of it. You can agree or disagree, but it has nothing to do with how anyone calls themselves.
crazygringo
Those are idiosyncratic definitions that are not even remotely common or shared. If people misunderstand your message because you chose your words poorly, that's your fault.
It also feels like something of a straw man -- in reality, you have junior programmers/coders and senior programmers/coders and they get better over time.
In real life, they aren't two distinct activities. People write code to get things done, and as they get better they are able to write code that is faster, more elegant, or more scalable. But premature optimization is also the root of all evil, as one saying goes -- plenty of code doesn't need to be faster, more elegant, or more scalable. It just has to work and meet a deadline. And there's nothing wrong with that.
fc417fc802
The point isn't better or faster, at least in and of itself. It's the higher level approach to system design. The balancing of competing goals, including things like less error prone or less effort to maintain.
You speak of programmers getting better over time. The point is to break that improvement down into distinct categories.
Of course they're idiosyncratic definitions. He's attempting to make a distinction which he feels is useful. He needs words to communicate that distinction.
pron
> If people misunderstand your message because you chose your words poorly, that's your fault.
But nobody misunderstood a message, which was clearly communicated in the talk. The response was to a talk title, and titles often just try to be catchy.
> In real life, they aren't two distinct activities. People write code to get things done, and as they get better they are able to write code that is faster, more elegant, or more scalable. But premature optimization is also the root of all evil
It's nothing to do with optimisation or elegance, but with the process of writing software that does what it's intended to do, in a way that "gets things done" more easily in some situations. It's fine if you don't want to watch this interesting talk by one of the world's preeminent computer scientists, a Turing-award-winning expert in software correctness, but this discussion about the talk's three-word title is pointless.
AnotherGoodName
I’d argue this whole discussion is a flaw of the English language. Quoting Pratchett “English doesn’t borrow from other languages. English follows other languages down dark alleys, knocks them over and goes through their pockets for loose grammar.”
English has a lot of words from different origins that are more or less synonyms, and one of the most common time-wasting behaviours I've seen is people endlessly trying to categorize those words in endless debate.
The web is full of endless slideshows with titles such as 'the difference between management and leadership' or some such. One's a Latin word and the other's German for basically the same concept, and since English has both (it steals from every other language without care) you'll find a million slideshows people have created on the differences between the words. This whole thread is yet another example of this behaviour, and if you're aware of it you'll very quickly tire of every fucking 'the difference between [English loanword from Latin] and [English loanword from German]' thread you see.
umanwizard
You’re right overall, but I hate that quote.
English doesn’t have a lot of duplicate words because of some aggressive vocabulary-stealing nature. It has a lot of duplicate words for the same reason modern Nahuatl has a lot of loanwords from Spanish: England was colonized and ruled for centuries by non-English-speaking people.
pixl97
“The problem with defending the purity of the English language is that English is about as pure as a cribhouse whore. We don't just borrow words; on occasion, English has pursued other languages down alleyways to beat them unconscious and rifle their pockets for new vocabulary.”
― James D. Nicoll
CydeWeys
These fine-grained attempts at parsing the words to mean things other than how they are commonly used don't serve any purpose. Yes, of course the act of physically typing characters in a programming language is not exactly the same thing as thinking about the algorithms you want the computer to carry out, but so what? It's trivial and it doesn't matter, and specifically using language to highlight that difference is pointless. To use an analogy, people will often say that "so and so website says X", but the website didn't actually say anything; it can't talk. What they mean is that they read so-and-so text on a website. But we all know what they mean, and it's annoying and pointless to jump in and correct the language there. Similarly, it's annoying and pointless to pedantically argue that "well actually, that's not programming, that's coding".
mrkeen
> It's trivial and it doesn't matter
They are different and they absolutely do matter.
DateTime.Now() is a perfectly valid thing to write while coding. Unless you are in a distributed system, where 'now' doesn't exist, so all source code using 'DateTime.Now()' is automatically suspect. How do you know if you're in a distributed system? That's a programming question, not a coding question. And from a lot of the microservice push-back you get here on HN ("just use a single DB instance - then your data is always consistent"), a lot of devs don't realise they're in a distributed system.
"Backtracking", "idempotent", "invariant", "transaction", "ACID", "CAP", "secure", "race condition", "CRDT", are all terms that exist at a programming level, but they don't show up in the source code. A single effectful statement in the source code can be enough to break any of these quoted programming terms.
pron
> These fine-grained attempts at parsing the words to mean things other than how they are commonly used doesn't serve any purpose.
Except we're talking about a talk title. Lamport explains what he means in the talk. What I responded to was a comment on the content based entirely on the title.
> It's trivial and it doesn't matter, and specifically using language to highlight that difference is pointless.
Sure, and that is precisely Lamport's point. You really need to watch the talk. He shows how abstract algorithms cannot be fully described in any language that is intended to be executed by a computer.
> Similarly, it's annoying and pointless to pedantically argue that "well actually that's not programming, that's coding".
And Lamport is not doing that. You're arguing over a pithy title to a rather deep talk.
Workaccount2
>These fine-grained attempts at parsing the words to mean things other than how they are commonly used doesn't serve any purpose.
It's meant to differentiate human "programmers" from AI "coders". Ever since LLMs showed up there has been a noticeable urge to redefine the value proposition of software work to be less about what LLMs can do (write code) and more about what humans can do (write programs).
anonzzzies
I can already see next year's keynote: 'vibing isn't coding isn't programming isn't ...; a pyramid of shite that sometimes works a bit'. I am happy Dijkstra doesn't have to see this; he was angry in the 80s in my parents' living room; I cannot even imagine what he would be like with 'vibe coding'.
frou_dh
"Vibe coding" seems like one of those things where there's way more people complaining about the concept than there ever were people actually promoting it in the first place.
anonzzzies
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...
ziddoap
Yes, there are results for people saying "vibe coding" on HN. I don't feel like reading the ~500 or so mentions of it to tally which ones are people promoting it vs. which are people complaining about it. Do you have a summary of your results?
The first page seems mostly to be people complaining.
api
> ... seems like one of those things where there's way more people complaining ... than there ever were people actually promoting it in the first place.
The entire social and political discourse since around 2016.
Cthulhu_
It's going to be rage bait at this point.
alabastervlog
> he was angry in the 80s in my parents living room
Yikes, how bad was your parents’ living room?
anonzzzies
My father was a commercial guy (and a good programmer) who wanted to sell development services to companies. Dijkstra didn't really like this idea, especially if it was done in a way where badly educated (...) people would write code that was not proven correct and distribute it to clients, making the world worse.
jonnycat
Looking back, they were both right!
eikenberry
> wanted to sell development services to companies
So he wanted to start a contracting service? Contractors are famous for writing piles of untested spaghetti code that only kind of works, but by then they've finished and the contract is over. Then some poor sap gets stuck trying to maintain it. Probably one of the worst ways for businesses to acquire custom software.
desdenova
That's exactly what ended up happening.
tmpz22
Vibe coding is analogous to programming while high with identical results.
latexr
At least if you’re programming high (or drunk) you get a different kind of fun and the next day you know you have to recheck your work. The one thing that bothers me about LLM coding¹ is that the vast majority of people doing it don’t understand the issues. Those who do check their work and believe everyone else also does are immensely naive and will one day be severely bitten² by bad unreviewed code written by an LLM.
¹ Not calling it “vibe coding” to not go into pedantic definition discussions.
² As will we all.
baq
Vape coding, maybe. Vibe coding react interfaces for backoffice tools is very real, speaking first hand.
anonzzzies
Is it though? What does the term mean? Are we talking to our computers, not reading anything, and just judging the non-coder end result? Or are you just using it to help with coding? Vibe coding would be the former, and while it sometimes works, it doesn't in many cases, including LoB tools if the backoffice is large and complex enough. Unless you have some interesting tooling to help out, in which case it is possible, but that's not really vibe coding at the moment.
ge96
Funny, while getting high used to make me creative in the past (drawing), I could not do anything hard like writing logic. I also used to think my dumb ideas were good, so yeah, I don't do it anymore.
smusamashah
What exactly was he angry at?
anonzzzies
People writing software without proper foundations of logic and proofs. Which was by then quite 'common' (now it's normal of course).
miroljub
> a pyramid of shite
Why do some people use abominations like "shite" or "shoot" instead of just using "shit"? C'mon lads, everyone knows what you want to say, why not just say it?
JimDabell
“Shite” isn’t a minced oath. It’s a different word to “shit” with a different etymology. Same as “feck” is a different word to “fuck”, even if they are used in the same way.
umanwizard
That's not true, at least according to Etymonline. Got a source?
bena
I'd imagine he's British. "Shite" seems to be the spelling favored by the British. Much like arse/ass
voidUpdate
We use both, in different circumstances
ChrisRR
Shite is a perfectly normal use in the UK, especially in Scotland. It has a slightly different effect to shit
osigurdson
People can use whatever word they want.
dpassens
Shite isn't a euphemism like shoot, it's a synonym of shit.
anonzzzies
Some people don't live in the US, woooow... In the UK, and especially in certain parts of it, shite is the actual word to use. Shite is definitely not the same as gosh or shoot; it's as vulgar as shit, if not more, but our own.
I personally also don't like gosh or shoot, as we all know what you mean, but this is not that case; look it up.
neuroelectron
Leslie Lamport's SCaLE 22x Keynote: Think, Abstract, Then Code
Lamport argued for a fundamental shift in programming approach, emphasizing thinking and abstraction before coding, applicable to all non-trivial code, not just concurrent systems.
Abstraction First: Define an abstract view (the "what" and "how") of your program before writing code. This high-level design clarifies logic and catches errors early. Focus on ideas, not specific languages.
Algorithms != Programs: Algorithms are abstract concepts; programs are concrete implementations.
Executions as States: Model program executions as sequences of states, where each state's future depends only on its present. This simplifies reasoning, especially for concurrency.
Invariants are Crucial: Identify invariants—properties true for all states of all executions. Understanding these is essential for correctness.
Precise Specifications Matter: Many library programs lack clear specifications, hindering correct usage, particularly with concurrency. Precise, language-independent descriptions of functionality are necessary.
Writing is Thinking: Writing forces clarity and exposes sloppy thinking. Thinking improves writing. It's a virtuous cycle.
Learn to Abstract: Abstraction is a core skill, central to mathematics. Programmers need to develop this ability.
AI for Abstraction? The question was raised whether or not AI models could be used for the abstract-thinking part of programming.
The core message: Programming should be a deliberate process of thoughtful design (abstraction) followed by implementation (coding), with a strong emphasis on clear specifications and understanding program behavior through state sequences and invariants. Thinking is always better than not thinking.
sevensor
With all due respect to Dr. Lamport, whose work I greatly admire, I take issue with his example of the max function. The problem he sets up is “what to do with an empty sequence.” He proceeds to handwave the integers and suggest that negative infinity is the right value. But in realistic use cases for max, I want to assume that it produces a finite number, and if the inputs are integral, I expect an integer to pop out.
Apologies to Lamport if he addresses this later; this is where I quit watching.
There are basically two ways out of this that I'm aware of: either return an error, or make the initial value of the reduction explicit in the function signature, i.e.
max(x, xs)
and not max(xs)
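(A minimal Python illustration of the two ways out; as it happens, Python's built-in max offers both, raising ValueError on an empty sequence or taking an explicit default, while reduce's initializer plays the role of x in max(x, xs).)

    from functools import reduce

    xs = []

    # Way 1: return an error. Python's max raises on an empty sequence.
    try:
        max(xs)
    except ValueError:
        print("no max of an empty sequence")

    # Way 2: the caller supplies the initial value, max(x, xs) style.
    print(reduce(lambda a, b: a if a >= b else b, xs, 0))  # caller chose x = 0

    # Lamport's negative infinity is just one particular choice of x:
    print(max(xs, default=float("-inf")))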
ndriscoll
I had the same thought skimming the PDF, but he does address it. One of his suggestions is to use an error value as the point at infinity. So I suppose the point is the coding looks the same, but you're taking a more principled view of what you're doing for reasoning purposes.
pron
> But in realistic use cases for max, I want to assume that it produces a finite number, and if the inputs are integral, I expect an integer to pop out.
But that's his point. You're talking about a concrete concern about code, whereas he says that first you should think in abstractions. Negative infinity helps you, in this case, to think in abstractions first.
fc417fc802
Are the integers not a reasonable abstraction though?
tasuki
The answer is a NonEmptySequence type. The `max` function doesn't make sense in the context of an empty sequence.
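(A Python sketch of that type-level fix; the class shown is hypothetical, but it illustrates how the empty case becomes unrepresentable and max becomes total.)

    class NonEmptySequence:
        def __init__(self, head, *tail):
            # The signature itself demands at least one element.
            self._items = (head, *tail)

        def max(self):
            return max(self._items)  # can never raise ValueError

    print(NonEmptySequence(3, 1, 4).max())  # 4
    # NonEmptySequence()  # TypeError: the empty case cannot be constructed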
MrMcCall
> Thinking is always better than not thinking.
"Truer words were never spoken." --Case from Gibson's Neuromancer
neuroelectron
Key points:
Algorithm vs. Program: An algorithm is a high-level, abstract concept implementable in various programming languages. We should focus on the ideas, not specific languages.
Concurrency Challenges: Concurrent programs are hard due to thread interleaving, leading to numerous possible executions. Debugging is unreliable, and changes can expose hidden bugs.
Abstraction is Key: Find an abstract view of the program, especially for concurrency, describing how threads synchronize. This higher-level thinking is crucial before coding.
All programs: any piece of code that requires thinking before you write code.
What and How: For most programs, write both "what" the program does and "how" it does it. Precise languages (like Lamport's TLA+) are vital for verifying concurrent programs but not essential for all.
Trivial Example: A simple "max element in array" function illustrates how abstraction simplifies even trivial problems. It revealed a bug in the initial "what" (handling empty arrays) and led to a more robust solution using mathematical concepts (minus infinity).
Executions as State Sequences: View program executions as sequences of states, where the next state depends solely on the current one. This simplifies understanding.
Invariants: An invariant is a condition true for all states of all executions. Understanding invariants is crucial for proving program correctness.
Termination: The example program doesn't always terminate, showcasing the need to consider termination conditions.
Real-World TLA+ Usage: Amazon Web Services uses TLA+ to find design flaws before coding, saving time and money. The Rosetta mission's operating system, Virtuoso, used TLA+ for a cleaner architecture and 10x code reduction.
Abstraction Beyond Concurrency: Even when the "what" is imprecise (like a pretty-printer), abstraction helps by providing high-level rules, making debugging easier.
Thinking and Writing: Thinking requires writing. Writing helps you think better, and vice versa.
Learn Abstraction: Abstraction is central to math. Programmers should learn to think abstractly, possibly with guidance from mathematicians.
Why Programs Have Bugs: Library programs often lack precise descriptions, making them hard to use correctly. This is especially true for concurrent programs where input/output relations aren't sufficient.
Lamport emphasizes that programming should be about thoughtful design through abstraction before coding. He advocates viewing executions as state sequences, understanding invariants, and using writing to clarify thinking. He also highlights the importance of precise specifications, especially for library programs. He also touches on the issue of whether or not it is better to outsource abstract thinking to AI.
fwip
Could we get a policy that just pasted LLM output like this should be labeled?
umanwizard
Please just stop posting LLM content to HN. If I want to know what an LLM has to say I can ask it myself.
mattmanser
I wanted to like this talk. I started to like this talk. But when he jumps from talking about a problem in clear language to mathematical language, I felt he's gone so wrong.
For me, there's no way "Set x to the smallest number >= to all elements of A" is clear to more than a tiny fraction of people in the world. And I was educated in formal logic at uni, taking an advanced 3rd-year course with just 5 out of the 20,000-odd students at a top UK uni. So only a tiny percentage of a small percentage of the population has ever been exposed to that sort of logic.
The (massive) mistake he's made is thinking that the language of logic is clear and concise, and that changing the words somehow magically changes the meaning. It isn't, and it doesn't.
I'm sure it works for him. It would not work for most of us that need to communicate clearly with other people, including programmers who are not aware of the full meaning of that sentence.
It's not how most humans work, think or communicate.
Part of many programmers' job is to use language that is understandable. Using the specialist language of maths does not make the requirements any clearer for most people, just for mathematicians. Worse still, the sharp listeners among you will have noticed that, having completely rewritten his "what", he almost immediately said "most mathematicians agree that the smallest number...".
Notice the "most". So his rewritten "what" is not even right for all mathematicians.
mrkeen
Perhaps you agree with Lamport even harder than Lamport does.
If your team is going to get together and implement "Set x to the smallest number >= to all elements of A", and you feel that it's an ambiguous or incomplete description, then it's up to you guys to formalise what is meant - a programming activity - rather than just coding it.
If it is as poorly-specified as you suggest, then someone on your team will think that they have correctly implemented it (tests and all!) but it will give the wrong answer in production.
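(One executable reading of that formalisation step, sketched in Python; the function and property names are illustrative, and the spec shown is just one candidate interpretation of the sentence.)

    NEG_INF = float("-inf")

    def smallest_upper_bound(A):
        # Candidate reading of "the smallest number >= to all elements of A".
        return max(A, default=NEG_INF)

    def satisfies_spec(x, A):
        is_upper_bound = all(x >= a for a in A)
        # For a finite set, the least upper bound is the max itself,
        # or negative infinity when the set is empty.
        is_least = (x in A) or (not A and x == NEG_INF)
        return is_upper_bound and is_least

    for A in ([], [3, 1, 4], [-5]):
        assert satisfies_spec(smallest_upper_bound(A), A)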
lokar
He has written and spoken a number of times about the need for more mathematical (which I think would include logic and metalogic) education and thinking in CS.
If all programmers/software engineers had a reasonable level of mathematical formalism (could understand and write proofs), and some amount of formal logic, that would be much better.
I think that is his position
Also, understand this is coming from one of the most prominent living people in the field. People ask him for his opinion about this stuff, so he shares it.
mattmanser
In reality that is a waterfall process disguised as programming advice.
What that is saying is that if only you really specified the code in a clear technical language, you wouldn't need to write much of it. It's like his version of UML.
But it all boils down to what we know doesn't work, a waterfall development process.
But worse still, it's a waterfall process where no-one outside a small clique, those who can speak the special language, can join in the design process.
MITSardine
To be pedantic, the definition is wrong too. What he defines is not the max but the supremum:
- max: the largest element of the set, if it exists (it must belong to the set)
- sup: the smallest number greater than or equal to all elements of the set (the least upper bound), which need not belong to it
But he doesn't really use the definition he gives anyway; his invariant to prove correctness does not rely on it.
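(In symbols, the standard definitions, added here for contrast; the classic example is the open interval (0, 1), whose sup is 1 even though it has no max.)

    \max S = m \quad\text{where}\quad m \in S \ \text{and}\ \forall s \in S:\ s \le m

    \sup S = \min\,\{\, u \in \mathbb{R} \ :\ \forall s \in S:\ s \le u \,\}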
kccqzy
An undergraduate mathematics professor of mine liked to use the word coding to refer to any act of transforming concepts into a more precise and machine readable form. Not just writing what you want a computer to do in a programming language, but also encoding data. The word "encode" should make this clear: we are turning something into code. Right afterwards, he drew some binary trees on a blackboard and asked us to come up with a coding scheme to turn these differently shaped binary trees into natural numbers. That means defining an injective function from the set of binary trees to the set of natural numbers.
Coding is too ambiguous a word and I don't really use it much.
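(A sketch of one such injective coding in Python, using the Cantor pairing function; the scheme is one possible answer, not necessarily the professor's. Empty trees are None, nodes are (left, right) pairs.)

    # Cantor pairing is a bijection N x N -> N, so by structural induction
    # distinct trees get distinct codes.
    def pair(a, b):
        return (a + b) * (a + b + 1) // 2 + b

    def encode(tree):
        if tree is None:
            return 0
        left, right = tree
        return 1 + pair(encode(left), encode(right))

    leaf = (None, None)
    print(encode(None))          # 0
    print(encode(leaf))          # 1
    print(encode((leaf, None)))  # 2
    print(encode((None, leaf)))  # 3 -- differently shaped trees, different codes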
cloogshicer
Lamport argues that we should separate the "What?" from the "How?". I wonder though, for most problems, doesn't the "What" and the "How" of a program somewhat merge?
For example, are performance considerations part of the "What", or of the "How"?
osigurdson
>> separate the "What?" from the "How?"
This is a huge mental error that people make over and over again. There is no absolute "what" and "how", they are just different levels of abstraction - with important inflection points along the way.
The highest level of abstraction is to simply increase happiness. Usually this is accomplished by trying to increase NPV. Let's do that by making and selling an accounting app. This continues to get more and more concrete down to decisions like "should I use a for loop or a while loop"
Clearly from the above example, "increase NPV" is not an important inflection point as it is undifferentiated and not concrete enough to make any difference. Similarly "for loop vs while loop" is likely not an inflection point either. Rather, there are important decisions along the way that make a given objective a success. Sometimes details can make a massive difference.
glass_of_water
I could not agree more. When trying to understand the distinction between "declarative" and "non-declarative" code, I came to the same conclusion: it's all relative.
molf
I think he argues that more thought should go into explicitly designing the behaviour of a software system ("What?"), independently from its implementation ("How?"). Explicitly designed program behaviour is an abstraction (an algorithm or a set of rules) separate from its implementation (code written in a specific language).
Even if a user cannot precisely explain what a program needs to do, programmers should still explicitly design its behaviour. The benefit, he argues, is that having this explicit design enables you to verify whether the implementation is actually correct. Real-world programs without such designs, he jokes, are by definition bug-free, because without a design you can't determine if certain behaviour is intentional or a bug.
Although I have no experience with TLA+ (which he designed for this purpose in the context of concurrency), this advice does ring true. I have mentored countless programmers, and I've observed that many (junior) programmers see program behaviour as an unintentional consequence of their code, rather than as deliberate choices made before or during programming. They often do not worry about "corner cases", accepting whichever behaviour emerges from their implementation.
Lamport says: no, all behaviour must be intentional. Furthermore, he argues that if you properly design the intended program behaviour, your implementation becomes much simpler. I fully agree!
cloogshicer
The problem I have with this is that I haven't seen a precise definition of the difference between behavior and implementation. Another word that people use for behavior is 'specification'.
However, a spec that has been sufficiently formalized (so that it can be executed) is an implementation. Maybe an implementation with certain (desirable or undesirable) characteristics, but still an implementation.
Of course there are informal, incomplete specifications that can't be executed. Those also have value of course, but I'd argue that writing those isn't programming.
stonemetal12
Isn't that more or less what all abstraction is about: getting an interface that allows you to work with the "what" but not the "how"? When I call file open, I get the "what", a file to work with, but not the "how": DMA access, spinning rust, SSDs, could be an NFS in there, who knows.
Yes, the "hows" have impacts on performance. When implementing or selecting the "how", those performance criteria flow down into "how" requirements. As far as I am concerned, that is no different from correctness requirements placed on the "how".
Coming from the other direction: if I am a hard disk manufacturer I don't care about the "what". I only care about the "how" of getting storage and the disk IO interface implemented so that my disk is usable by the file system abstraction that the "what" cares about. I may not know the exact performance criteria for your "what", but more performance equals more "whats" satisfied and willing to use my "how".
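(A tiny Python sketch of that separation, with illustrative names: the caller programs against the "what", a file-like object, and never sees the "how" of where the bytes live.)

    import io

    def count_lines(f):
        # Depends only on the "what": something iterable line by line.
        return sum(1 for _ in f)

    print(count_lines(io.StringIO("a\nb\n")))  # in-memory "how"

    with open("demo.txt", "w") as f:           # create a file to read back
        f.write("a\nb\nc\n")
    with open("demo.txt") as f:                # on-disk "how", same "what"
        print(count_lines(f))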
emporas
Separating the "What" from the "How" is a pretty old idea originating from Prolog, and yes you can do that. Erlang for example, in which many of Lamport's ideas are implemented, is a variation of Prolog.
Prolog by itself hasn't found any real-world applications as of today, because by ignoring the "How", performance suffers a lot, even though the algorithm might be correct.
That's the reason algorithms are implemented procedurally, and then some invariants of the program are proved on top of the procedural algorithm using a theorem prover like TLA+.
But in practice we implement procedural algorithms and we don't prove any property on top of them. It is not like the average programmer writes code for spaceships. The spaceship market is not significant enough to warrant that much additional effort.
MrMcCall
"not 2, not 1" is the old Zen abstraction for deeply interrelated, but separate concepts.
The What puts constraints on the How, while the How puts constraints on the What.
I would say that the What is the spatial definition of the algorithm's potential data set, while the How is the temporal definition of the algorithm's potential execution path.
WillAdams
Interesting takeaway (paraphrased):
>Algorithms are not programs and should not be written in programming languages and can/should be simple, while programs have to be complex due to the need to execute quickly on potentially large datasets.
Specifically discussed in the context of concurrent programs executing on multiple CPUs due to order of execution differing.
Defining programs as:
>Any piece of code that requires thinking before coding
>Any piece of code to be used by someone who doesn't want to read the code
Apparently he has been giving this talk for a while:
https://www.youtube.com/watch?v=uyLy7Fu4FB4
(previous discussions of it here?)
Interesting that the solid example, simplifying finding the largest item in a set to finding the smallest value greater than or equal to every element and starting the search with the value negative infinity, was exactly "Define Errors Out of Existence" from John Ousterhout's book:
https://www.goodreads.com/book/show/39996759-a-philosophy-of...
as discussed here previously: https://news.ycombinator.com/item?id=27686818
(but of course, as an old (La)TeX guy, I'm favourably disposed towards anything Leslie (La)mport brings up)
MrMcCall
> starting the search with the value negative infinity
But then there is no differentiation between the result of the empty set and a set containing only negative infinities.
I consider them separate conditions and would therefore make them separate return values.
That's why I would prefer returning a tuple of a bool hasMax and an int maxV, where maxV's int value is only meaningful if hasMax is true, meaning that the set is non-empty.
Another way of doing such would be passing in an int defMax that is returned for the empty set, but that runs into the same problem should the set's actual max value be that defMax.
Anyway, I typed it up differently in my other toplevel comment in this thread.
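(Roughly what that tuple-returning shape looks like as a Python sketch; the hasMax/maxV names follow the comment above, the rest is illustrative.)

    def try_max(xs):
        # Return (has_max, max_v); max_v is only meaningful when has_max is True.
        has_max, max_v = False, 0
        for x in xs:
            if not has_max or x > max_v:
                has_max, max_v = True, x
        return has_max, max_v

    print(try_max([float("-inf")]))  # True -inf: distinguishable from empty
    print(try_max([]))               # False 0: no meaningful max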
ndriscoll
Abstractly dealing with a problem where there's already a negative infinity is no problem. e.g. you can just add another negative infinity that's even smaller. That negative infinity can also be your error value; concretely you might do that by writing a generic algorithm that operates on any Ordered type, and then provide an Ordering instance on Result[E,A] where an E is less than all A. `A` could be some extended number type, or another `Result[E,A']`, or whatever thing with its own `Ordering`.
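(A Python sketch of that lifted ordering; the Err/Ok names are borrowed from Rust-style result types and are assumptions, not from the talk or the comment.)

    from functools import total_ordering

    @total_ordering
    class Err:
        # An error value that sorts below every Ok: the extra negative infinity.
        def __eq__(self, other): return isinstance(other, Err)
        def __lt__(self, other): return not isinstance(other, Err)
        def __repr__(self): return "Err"

    @total_ordering
    class Ok:
        def __init__(self, value): self.value = value
        def __eq__(self, other):
            return isinstance(other, Ok) and self.value == other.value
        def __lt__(self, other):
            return isinstance(other, Ok) and self.value < other.value

    # max over the lifted domain is total; the empty case yields the error value.
    print(max([Ok(2), Ok(5), Err()]).value)  # 5
    print(max([], default=Err()))            # Err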
aragilar
But why do you care that the set is empty (in the abstract, obviously if there was a reason you'd expect the set to be non-empty, that's a different thing)?
The trick is to think about invariants and state; if you frame it that way, the empty set case just means there is only a single state.
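(A small Python sketch of that framing for the talk's max example: the execution is a sequence of states, an invariant holds in every state, and the empty set gives exactly one state. All names are illustrative.)

    NEG_INF = float("-inf")

    def states(A):
        x, i = NEG_INF, 0      # initial state -- the only state when A is empty
        yield x, i
        for i, a in enumerate(A, 1):
            x = max(x, a)
            yield x, i         # state after examining i elements

    A = [3, 1, 4]
    for x, i in states(A):
        # Invariant: x is the max of the first i elements (neg. infinity at i = 0).
        assert x == max(A[:i], default=NEG_INF)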
rubiquity
I'm enjoying the irony of the comments section being primarily occupied by people who don't get the message while simultaneously being AI maximalists. Leslie Lamport's entire point is that developing abstract reasoning skills leads to better programs. Abstraction in the math and logic sense lets you do away with all the irrelevant tiny details, just as AI-guided software development is supposed to do.
What's sad, but to be expected with anything involving rigor, is how many people only read the title and developed an adverse reaction. The Hacker in Hacker News can be a skilled programmer who can find their way around things. Maybe now it's more of the hack as in "You're a Hack", meaning you're unskilled and produce low-quality results.
rpmisms
Not to be meta, but this talk and this discussion are insanely pedantic.
cratermoon
As I look around at the state of software development in the tech industry, I see a desperate need for more pedantry, not less. As a whole the industry seems bent on selling garbage that barely works, then gradually making it worse in pursuit of growth-at-all-costs.
When nobody cares whether their systems work in any meaningful way, of course they'll dismiss anything that hints at rigor and professionalism as pedantry.
rpmisms
I'm not saying pedantry is bad per se, but I think it can be taken too far, which is happening here.
neuroelectron
Yes, I had to pump it through Gemini three times before I could really get what he was talking about. He spends the first three and a half minutes just starting to frame the subject, which I gleaned from reading the transcript. I wonder if this is his academic background or a long career of billing by the hour. Probably both.
MrMcCall
Maybe my never using LLMs means I can still understand a talk, even one as poorly recorded and delivered as this one was.
> Probably both.
Maybe neither. Maybe PEBCAK.
neuroelectron
You watched the whole hour?
taeric
There is a good article in the current ACM about how we don't agree on what abstractions are; and in spite of that, they are very useful. Point being that we largely agree where the important points are. We don't really agree on exactly what they are, or why they are important.
In that vein, I think people will often talk about the same general topics with massive agreement that they are the important ones. All the while not actually agreeing on any real points.
You will find lots of inspiration. Which, I think, has its own value.
rustybolt
Hacking isn't coding isn't programming isn't software development isn't software engineering. But in the end many people use these terms mostly interchangeably and making a point of the differences between the definitions you personally use is rarely a productive use of time.
dkarl
Reminds me of: "People think software development is just writing code. But it isn't just writing: it requires planning, experimentation, research, and consideration of style and structure."
Writers everywhere: "Excuse me?"
I'm tired of all these bullshit attempts to establish one word for doing it poorly and a different word for doing it well. It's a dumb idea that teaches nobody anything and does nothing to raise the level of professionalism in the industry. Let's take a common word that people use to describe what they do and declare it to be pejorative; that'll show them! I'm disappointed that somebody with such extraordinary theoretical achievements would lower himself to such empty corporate consultant level rhetorical bullshit.
I'm sure the talk is fine, but right now I feel like I don't need to listen to it.