The Big OOPs: Anatomy of a Thirty-Five Year Mistake
165 comments · July 19, 2025
DavidPiper
So much gold to mine in this talk. Even just this kind of throwaway line buried deep in the Q&A:
> I prefer to write code in a verb-oriented way not an object-oriented way. ... It also has to do with what type of system you're making: whether people are going to be adding types to the system more frequently or whether they're going to be adding actions. I tend to find that people add actions more frequently.
Suddenly clicked for me why some people/languages prefer doThing(X, Y) vs. X.doThing(Y)
shuaiboi
bjarne (creator of c++) has a quote about this:
Unified function call: The notational distinction between x.f(y) and f(x,y) comes from the flawed OO notion that there always is a single most important object for an operation. I made a mistake adopting that. It was a shallow understanding at the time (but extremely fashionable). Even then, I pointed to sqrt(2) and x+y as examples of problems caused by that view.
https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2019/p19...
ivanjermakov
There is a most important argument regardless of whether it's a method or a regular function: the first one (or the last one, in languages supporting currying). While it's true that there are functions with parameters of equal importance, most of those are commutative anyway.
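One reason argument order matters for currying-style languages is partial application: if the "most important" argument comes last, you can fix the configuration first and reuse the result. A minimal Python sketch (the `replace`/`censor` names are illustrative, not from any library):

```python
from functools import partial

# Data-last style: the text being operated on is the final parameter,
# so partial application can pin down the operation's configuration first.
def replace(old, new, text):
    return text.replace(old, new)

# A reusable, pre-configured operation built by fixing the first arguments.
censor = partial(replace, "secret", "*****")

print(censor("a secret message"))  # a ***** message
```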
saghm
The main benefit of x.f(y) IMO isn't emphasizing x as something special, but allowing a flat chain of operations rather than nesting them. I think the differences are more obvious if you take things a step further and compare x.f(y).g(z) and g(f(x, y), z). At the end of the day, the difference is just syntax, so the goals should be to aid the programmer in writing correct code and to aid anyone reading the code (including the original programmer at a later point in time!) in understanding the code. There are tradeoffs to using "method" syntax as well, but to me that mostly is an argument for having both options available.
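The chaining-vs-nesting contrast above can be sketched concretely. A small Python example (the `Text` class and helper names are illustrative):

```python
# Two renderings of the same computation: a flat method chain vs nested calls.

class Text:
    def __init__(self, s):
        self.s = s
    def strip(self):
        return Text(self.s.strip())
    def upper(self):
        return Text(self.s.upper())
    def truncate(self, n):
        return Text(self.s[:n])

# Method syntax reads left to right, in execution order:
a = Text("  hello world  ").strip().upper().truncate(5).s

# Free-function syntax nests, so it reads inside out:
def strip_(t): return t.strip()
def upper_(t): return t.upper()
def truncate_(t, n): return t[:n]

b = truncate_(upper_(strip_("  hello world  ")), 5)

print(a, b)  # HELLO HELLO
```

Languages with a pipeline operator (F#'s `|>`, Elixir's `|>`) get the flat reading without committing to a "most important" receiver.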
atsbbg
That's exactly the context of where this quote comes from. He wanted to introduce Unified call syntax[1] which would have made both of those equivalent.
But he still has a preference for f(x,y). x.f(y) gives you chaining, but it also gets rid of multiple dispatch / multimethods, which are more natural with f(x,y). Bjarne has been trying to add this back into C++ for quite some time now.
https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n44...
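The multimethod point can be made concrete: with f(x,y) there is no privileged receiver, so dispatch can consider the types of both arguments. A toy registry-based sketch in Python (all class and function names are illustrative):

```python
# A minimal multimethod: dispatch on the types of *both* arguments,
# something neither x.f(y) nor single dispatch gives you directly.
_collide = {}

def collide_impl(type_a, type_b):
    def register(fn):
        _collide[(type_a, type_b)] = fn
        return fn
    return register

def collide(a, b):
    return _collide[(type(a), type(b))](a, b)

class Asteroid: pass
class Ship: pass

@collide_impl(Asteroid, Ship)
def _(a, b): return "ship takes damage"

@collide_impl(Asteroid, Asteroid)
def _(a, b): return "asteroids merge"

print(collide(Asteroid(), Ship()))  # ship takes damage
```

Julia builds this into the language; in single-dispatch languages the same effect usually shows up as the visitor pattern.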
nickitolas
If my memory isn't failing me, that was part of the reason rust went with a postfix notation for their async keyword ("thing().await") instead of the more common syntax ("await thing()")
robertlagrant
> Suddenly clicked for me why some people/languages prefer doThing(X, Y) vs. X.doThing(Y)
It's when you start writing ThingDoer.doThing(X, Y) that you begin questioning things.
LearnYouALisp
Excuse me, you need a BeanShell.BeanThread.GeneratorStalk.VeggFactory to do that in this Environment®™
morkalork
But if you make it thingService.doThing(x,y) you're all good.
BoiledCabbage
More info on "The Expression Problem" https://en.wikipedia.org/wiki/Expression_problem
pjmlp
There are OOP languages that use doThing(X, Y), though.
Ada, Julia, Dylan, Common Lisp for example.
Yet another example why people shouldn't put programming paradigms all into the same basket.
virgilp
There are a handful of (somewhat exotic) languages that support multiple dispatch - pretty much, all those listed by you. None of the mainstream ones (C++, Java, C# etc) do.
(also Common Lisp is hardly a poster child of OOP, at best you can say it's multi-paradigm like Scala)
pjmlp
I guess Julia and Clojure are exotic.
Since when do OOP languages have to be single paradigm?
By that point of view, people should stop complaining about C++ OOP then.
igouy
Multi-methods do seem like a missed opportunity:
"Visitor Pattern Versus Multimethods"
saghm
Yeah, the dot operator is not a particularly strong signal of whether something is OOP or not. You could change the syntax of method calls in OO languages to not use the object as a prefix without the underlying paradigm being affected.
chrisg23
This is an excellent talk. It digs really deep into the history of OOP, from 1963 to 1998. The point is that in 1998 the commercial game "Thief" was developed using an entity component system (ECS) architecture and not regular OOP. This is the earliest example he knows of in modern commercial programming.
During his research into the history of OOP he discovered that ECS existed as early as 1963, but was largely forgotten and not brought over as a software design concept or methodology when OOP was making its way into new languages and being taught to future programmers.
There's lots of reasons for why this happened, and his long talk is going over the history and the key people and coming up with an explanatory narrative.
inopinatus
You can do ECS in any programming paradigm. It's not incompatible with OO at all. There's no need for the object model to be congruent to a static representation of a domain; in a line-of-business app it is much better for it to be congruent to workflow and processing.
Heck I’ve even done ECS in Rails for exactly this reason.
I never accepted the Java/C++ bastardisation of OOP and still think that Erlang is the most OO language, since encapsulation and message passing is so natural.
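The paradigm-independence claim is easy to demonstrate: the ECS idea (entities are ids, components are data tables, systems are functions over those tables) fits in a few lines of any language. A toy sketch in Python, with illustrative names:

```python
# A toy ECS: entities are ids, components are per-type tables,
# systems are plain functions over those tables.
from itertools import count

entity_ids = count()
positions = {}   # entity id -> (x, y)
velocities = {}  # entity id -> (dx, dy)

def spawn(pos=None, vel=None):
    e = next(entity_ids)
    if pos is not None: positions[e] = pos
    if vel is not None: velocities[e] = vel
    return e

def movement_system(dt):
    # Only entities that have *both* components participate.
    for e in positions.keys() & velocities.keys():
        x, y = positions[e]
        dx, dy = velocities[e]
        positions[e] = (x + dx * dt, y + dy * dt)

player = spawn(pos=(0.0, 0.0), vel=(1.0, 2.0))
scenery = spawn(pos=(5.0, 5.0))  # no velocity: movement ignores it

movement_system(1.0)
print(positions[player])   # (1.0, 2.0)
print(positions[scenery])  # (5.0, 5.0)
```

Note there is no class hierarchy anywhere: adding a "health" capability is a new table plus a new system, not a new branch in an inheritance tree.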
pjmlp
ECS is a part of OOP, hence why languages like Objective-C introduced protocols, while others like C++ and Eiffel went the multiple inheritance route.
Even Smalltalk, post Smalltalk-80 implementations eventually added traits, alongside its single inheritance model.
zaphar
I am not sure what the link between ECS and protocols, traits, and multiple inheritance is? ECS is mostly about memory layout not typed interfaces as far as I know.
pjmlp
Nope, that is just the spin game devs have since put on the ECS story.
If you want to go down the rabbit hole, let's start with the first question: how are ECS systems implemented in any random C++ or Rust game codebase?
lemonberry
Thank you for this summary. I'm a hobbyist programmer and want to watch this, but having a few concepts to hang my hat on helps contextualize this for me.
dsego
No offense, but this reads as a GPT generated summary.
tgv
I had difficulty understanding what irked me about the comment, but indeed, that's it. The mix of superficiality, congeniality, and random details sounds like an AI response. However, I don't think it is. But AI surely fucks up our trust.
dsego
Why not, it's a fresh new account, and doesn't bring any new insight, why would anyone genuinely write a comment like that?
chrisg23
None taken. This is the world we live in now.
lisbbb
I complained a lot about OOP all throughout my 25 years as a developer. I wrote a ton of C++ and then Java and nobody can refute my expertise with those languages. I saw so many people make mistakes using them, particularly with forcing taxonomies into situations that weren't conducive to having them. Then, when I began complaining to my colleagues about my feelings, I was ostracized and accused of "not having a strong enough skillset." In other words, the dogma of the time overrode the Cassandras saying that the emperor had no clothes. Meanwhile, the simple nature of C and even scripting languages was considered out of date. The software dev community finally realized how bad things had gotten with Java and then the walls came a tumbling down. I far prefer writing non-OO Python (or minimal use of classes) to anything else these days. I went all around the language world--did projects involving Lua, Clojure, tons of Groovy, then moved on to Functional Java, Kotlin, and Golang.
hackthemack
Similar experience. I would be the one the entire team would turn to when a really hard problem to debug came up. Yet, when I would say that OOP is not great and is over-complicating the code, I would be scoffed at. I never could reconcile how I was "leaned on" to fix things, but ignored in proposing different paradigms.
I recently read a quote, paraphrasing, Orthodoxy is a poor man's substitute for moral superiority.
gloomyday
This is a pervasive phenomenon in programming. Many times suboptimal solutions remain for a long time because, well, it does solve the problem. Once you are taught X by authoritative figures, you tend to lean on it. It takes experience and an open mind to do anything else.
The use of GOTO is another example. Yes, you probably wouldn't want it in your codebase, but the overzealousness against it has led people to argue against expressions of the same idea, like break statements or multiple return statements.
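The break/early-return point is easy to see side by side. Both functions below are equivalent; the names are illustrative:

```python
# Single-exit dogma forces flag variables; an early return says the same
# thing more directly.

def find_index_single_exit(items, target):
    result = -1
    i = 0
    found = False
    while i < len(items) and not found:
        if items[i] == target:
            result = i
            found = True
        i += 1
    return result

def find_index(items, target):
    for i, item in enumerate(items):
        if item == target:
            return i   # a "structured goto": exit as soon as we know
    return -1

print(find_index([3, 1, 4], 4))  # 2
```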
pjmlp
There is no such thing as non-OO Python, the language is like Smalltalk, everything is an object, even plain numeric values.
This wasn't true with original Python, however since new style classes became the default type system, everything is indeed an object.
So for the anti-OOP folks out there using languages like Python as an example,
Python 3.13.0 (tags/v3.13.0:60403a5, Oct 7 2024, 09:38:07) [MSC v.1941 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> x = 23
>>> type(x)
<class 'int'>
>>> dir(x)
['__abs__', '__add__', '__and__', '__bool__', '__ceil__', '__class__', '__delattr__', '__dir__', '__divmod__', '__doc__', '__eq__', '__float__', '__floor__', '__floordiv__', '__format__', '__ge__', '__getattribute__', '__getnewargs__', '__getstate__', '__gt__', '__hash__', '__index__', '__init__', '__init_subclass__', '__int__', '__invert__', '__le__', '__lshift__', '__lt__', '__mod__', '__mul__', '__ne__', '__neg__', '__new__', '__or__', '__pos__', '__pow__', '__radd__', '__rand__', '__rdivmod__', '__reduce__', '__reduce_ex__', '__repr__', '__rfloordiv__', '__rlshift__', '__rmod__', '__rmul__', '__ror__', '__round__', '__rpow__', '__rrshift__', '__rshift__', '__rsub__', '__rtruediv__', '__rxor__', '__setattr__', '__sizeof__', '__str__', '__sub__', '__subclasshook__', '__truediv__', '__trunc__', '__xor__', 'as_integer_ratio', 'bit_count', 'bit_length', 'conjugate', 'denominator', 'from_bytes', 'imag', 'is_integer', 'numerator', 'real', 'to_bytes']
>>>
treve
This is less about what metaphors are under the hood, but the patterns used on top of them. You can get technical, but it's definitely possible to write primarily functional, imperative or object-oriented code in Python, irrespective of what the syntax is for dealing with primitives.
pjmlp
Not really, because the machinery requires OOP to work.
That apparently non-OOP code requires bytecodes and runtime capabilities that only exist because of the OOP semantics of the VM.
It is like arguing one is not driving a steam engine only because they now put gas instead of wood.
constantcrying
The talk makes a very specific complaint. That complaint is not that you are associating data with the functions operating on that data.
What the talk is about is compile-time (and maybe execution-time, in the case of Python) hierarchies being structured as a mapping of real objects. This is how I was taught OOP and this is what people are recognizing as "OOP".
>So for the anti-OOP folks out there using languages like Python as an example,
Just because a language associates data with functions, does not mean that every program hierarchy has to map onto a real world relationship.
Why are you even commenting on this with your nonsense? Do you really think that if someone is complaining about OOP they are complaining that data types store functions for operating on that data? Has literally anyone ever complained about that?
igouy
> This is how I was taught OOP …
That's unfortunate.
"The simplistic approach is to say that object-oriented development is a process requiring no transformations, beginning with the construction of an object model and progressing seamlessly into object-oriented code. …
While superficially appealing, this approach is seriously flawed. It should be clear to anyone that models of the world are completely different from models of software. The world does not consist of objects sending each other messages, and we would have to be seriously mesmerised by object jargon to believe that it does. …"
"Designing Object Systems", Steve Cook & John Daniels, 1994, page 6
pjmlp
What matters are language implementations and CS definitions, not layman understanding on the street.
xg15
OOP has lots of flaws and is not a good choice in every context, but I still don't understand the universal hatred it seems to get now.
I think OOP techniques made most sense in contexts where data was in memory of long-running processes - think of early versions of MS Office or such.
We've since moved to a computing environment in which everything that is not written to disk should be assumed ephemeral: UIs are web-based and may jump not just between threads or processes but between entire machines between two user actions. Processes should be assumed to be killed and restarted at any time, etc.
This means it makes a lot less sense today to keep complicated object graphs in memory - the real object graph has to be represented in persistent storage and the logic inside a process works more like a mathematical function, translating back and forth between the front-end representation (HTML, JSON, etc) and the storage representation (flat files, databases, etc). The "business logic" is just a sub-clause in that function.
For that kind of environment, it's obvious why functional or C-style imperative programming would be a better fit. It makes no sense to instantiate a complicated object graph from your input, traverse it once, then destroy it again - and all that again and again for every single user interaction.
But that doesn't mean that the paradigm suddenly has always been bad. It's just that the environment changed.
Also, there may be other contexts in which it still makes sense, such high-level scripting or game programming.
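The "process as a translation function" view described above can be sketched concretely. A stateless handler that parses the wire format, applies a sliver of business logic, and writes back to storage (the function name, JSON shape, and dict-as-database are all illustrative assumptions):

```python
import json

def handle_add_item(request_body: str, db: dict) -> str:
    req = json.loads(request_body)            # front-end representation in
    items = db.setdefault(req["user"], [])    # storage representation
    items.append(req["item"])                 # the "business logic" sub-clause
    return json.dumps({"count": len(items)})  # front-end representation out

db = {}
print(handle_add_item('{"user": "u1", "item": "apple"}', db))  # {"count": 1}
print(handle_add_item('{"user": "u1", "item": "pear"}', db))   # {"count": 2}
```

No object graph outlives the request; all durable state lives in `db`.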
nickitolas
I'm a bit confused. What does any of this have to do with the central thesis of the talk? ("Compile time hierarchies of encapsulation that match the domain model were a mistake")
I understand that OOP is a somewhat diluted term nowadays, meaning different things to different people and in different contexts/communities, but the author spent more than enough time clarifying in excruciating detail what he was talking about.
constantcrying
Did you spend even 3 Minutes trying to understand what the talk was about?
lioeters
The presentation was recently discussed at:
https://news.ycombinator.com/item?id=44596554 [video] (37 comments)
This current posted link is an article by Casey Muratori with supplementary material on topics to explore further.
- Early History of Smalltalk
- History of C++
- Development of the Simula Languages
- Origins of the APT Language for Automatically Programmed Tools
kristianp
And on the other end of the spectrum, you have the proponents of Domain-driven design (DDD)[0], where they use an ML descended language such as F# and the aim is to make invalid states unrepresentable by the program [1]
[0] https://fsharpforfunandprofit.com/ddd/
[1] Make invalid states unrepresentable: https://geeklaunch.io/blog/make-invalid-states-unrepresentab...
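The "make invalid states unrepresentable" idea translates outside F# too. A sketch using Python dataclasses as a stand-in for sum types (the email domain and all names are illustrative):

```python
# Instead of one class with a nullable verified_on field and a boolean flag
# (which permits contradictory combinations), each state is its own type.
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class UnverifiedEmail:
    address: str

@dataclass(frozen=True)
class VerifiedEmail:
    address: str
    verified_on: str  # ISO date string, kept simple for the sketch

Email = Union[UnverifiedEmail, VerifiedEmail]

def send_password_reset(email: Email) -> str:
    # Only verified addresses may receive a reset link; the type carries
    # the evidence, so there is no flag to forget to check.
    if isinstance(email, VerifiedEmail):
        return f"reset link sent to {email.address}"
    return "verify your address first"

print(send_password_reset(UnverifiedEmail("a@b.c")))
```

In F# the `Email` union would be a real sum type and the compiler would force the match to be exhaustive; Python approximates that with `isinstance` checks or a type checker's exhaustiveness analysis.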
zozbot234
How is this "the other end of the spectrum"? The Typestate pattern described at https://geeklaunch.io/blog/make-invalid-states-unrepresentab... (especially wrt. its genericized variety that's quite commonly used in Rust) is precisely a "compile-time hierarchy of encapsulation that matches the domain model", to use Casey Muratori's term for what he's talking about. It's literally inheritance-based OOP in a trenchcoat.
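For readers unfamiliar with the typestate pattern being referenced: each protocol state becomes a distinct type, and each transition returns the next state's type, so out-of-order operations cannot even be written. A Python sketch with illustrative names (Rust enforces this at compile time; Python only at runtime):

```python
class ClosedFile:
    def __init__(self, path):
        self.path = path
    def open(self):
        return OpenFile(self.path)  # the only available transition

class OpenFile:
    def __init__(self, path):
        self.path = path
    def read(self):
        return f"contents of {self.path}"
    def close(self):
        return ClosedFile(self.path)

f = ClosedFile("notes.txt").open()
print(f.read())  # contents of notes.txt
# ClosedFile("x").read() would fail: the closed state simply lacks the method
```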
jbreckmckye
There is a good book on DDD in F#, Domain Modelling Made Functional
HexDecOctBin
Is there a similar recommended book using ML/OCaml or some other language of the family? i am hesitant to learn F#, knowing Microsoft's tendencies.
S04dKHzrKT
There are very few F# specific features used in the book. I imagine you could follow along pretty easily with any other functional language. You can easily use F# for the book and then apply the lessons learned to another language when you're done too. It mainly shows how to use sum types, product types and function composition to implement DDD.
I'm not sure what tendencies you're referring to though. F# has been around for 20 years and has only gotten better over time.
clickety_clack
Yes! I just got a copy of this a couple of days ago. Ive been on a DDD + FP kick recently and it’s leading to some really satisfying solutions.
Rochus
Entertaining. The presenter obviously doesn't like the class hierarchy corresponding to the domain model. He seems to think that this was an essential feature of OOP, supported by some quotations from Smalltalk exponents. But not even the Smalltalk world could agree on what OOP actually is (just compare the statements by Kay with the actual architecture of Smalltalk-76ff), and since Smalltalk lost its significance rather quickly, there is no need to dwell on it further.

I would rather look to a reputable industry organization such as the IEEE, which even publishes its own standards and best practices, for what OOP is about. E.g. the OOP Milestone (see https://ethw.org/Milestones:Object-Oriented_Programming,_196...), which names Simula 67 the first OO language, specifies OO as "the combination of three main features: 1) encapsulation of data and code 2) inheritance and late binding 3) dynamic object generation." No mention that the class hierarchy should correspond to the domain model.

So maybe we should just not mix up a programming paradigm with how it is used by some folks in practice? The fact that the loudest proponents of a paradigm are not usually those who apply it in practice remains true even today. Takes far less than 2.5 hours to state.
asa400
He literally gives extensive primary source citations to show that the originators of OOP presented this class-domain correspondence as the correct way to think about and do OOP. Bjarne Stroustrup is not just some random guy.
igouy
The source citations are facts. We can check that Alan Kay "The Early History of Smalltalk" shows this on page 82:
"Unfortunately, inheritance — though an incredibly powerful technique — has turned out to be very difficult for novices (and even professionals) to deal with."
When the presenter tells us — 13:45 "he was already saying he kind of soured on it" — that is not a fact, it's speculation. That speculation does not seem to be supported by what follows in "The Early History of Smalltalk".
One page later — "There were a variety of strong desires for a real inheritance mechanism from Adele and me, from Larry Tesler, who was working on desktop publishing, and from the grad students." page 83
And "A word about inheritance. … By the time Smalltalk-76 came along, Dan Ingalls had come up with a scheme that was Simula-like in it's semantics but could be incrementally changed on the fly to be in accord with our goals of close interaction. I was not completely thrilled with it because it seemed that we needed a better theory about inheritance entirely (and still do). … But no comprehensive and clean multiple inheritance scheme appeared that was compelling enough to surmount Dan's original Simula-like design." page 84
Rochus
> He literally gives extensive primary source citations to show that the originators of OOP presented this class-domain correspondence
In case of Dahl/Nygaard it seems logical since their work focus was on simulation. Simula I was mostly a language suited to build discrete-event simulations. Simula 67, which introduced the main features we subsume under "Object-Orientation" today, was conceived as a general-purpose language, but still Dahl and Nygaard mostly used it for building simulations. It would be wrong to conclude that they recommended a class-domain correspondence for the general case.
> Bjarne Stroustrup is not just some random guy
Sure, but he was a Simula user himself for distributed systems simulation during his PhD research at Cambridge University. And he learned Simula during his undergraduate education at Aarhus, where he also took lectures with Nygaard (a simulation guy as well). So also here, not surprising that he used examples with class-domain correspondence. But there was also a slide in the talk where Stroustrup explicitly stated that there are other valid uses of OO than using it for modeling domains.
t420mom
Is it fair to blame all of OOP for C++?
throwawaymaths
yes, because java and c# (and others, python to a certain extent) basically copied it. even ruby, which at its core is about "message passing", sure does a hell of a lot to hide that and make it feel c++ ish. i would bet at least 25% of ruby practitioners aren't aware that message passing is happening.
anp
Maybe not fair, but it’s pretty normal for people to assess paradigms based on their most popular implementations.
lisbbb
I don't hate C++ as much as I hate Java, though. That probably has more to do wit the time I was working with C++ and the kinds of projects versus the mind-numbing corporate back-end garbage I worked on with Java.
phendrenad2
It's funny because I read this comment and then watched the video, and it's like the first 15 minutes of the talk are dedicated to debunking this exact comment. Freaky.
dimal
Did you actually watch it? He talks A LOT more about Simula and C++ than Smalltalk. He goes back to the original sources: Kristen Nygaard and Bjarne Stroustrup. Seems odd to focus on Smalltalk when that’s not what the talk was about.
MantisShrimp90
i love casey and I love this talk. Always good to see people outside of academia doing deep research, and this corroborates a lot of how I have understood the subject.
I find it funny that even after he goes into explicit detail describing OOP back to the original sources, people either didn't watch it or are just blowing past his research to move the goalposts and claim that's not actually what OOP is, because they don't want to admit the industry is obsessed with a mistake, just like waterfall, and is too Stockholm-syndromed to realize it.
zaphar
Except that his talk is not anti-OOP. It's anti a specific way of using OOP: namely, representing the domain model as the compile-time hierarchy. He goes to great lengths to point out that he himself uses OOP concepts in his code. OOP wasn't a mistake per se. The mainstream way of using it, as promulgated by a number of experts, was the mistake.
mrkeen
The problem is when you take out mistakes, there's not much left of OOP.
We take out 'dog-is-an-animal' inheritance.
We take out object-based delegation of responsibility (an object shall know how to draw itself). A Painter will instead draw many fat structs.
Code reuse? Per the talk, the guy who stumbled onto this was really looking for a List<> use-case (not a special kind of Bus/LinkedList hybrid). He was after parametric polymorphism, not inheritance.
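The "Painter draws fat structs" alternative mentioned above looks like this in miniature (all names are illustrative; in a real renderer the structs would be laid out for cache-friendly iteration):

```python
# Instead of each shape knowing how to draw itself, shapes are plain data
# records and a Painter walks them.
from dataclasses import dataclass

@dataclass
class Circle:
    x: float; y: float; r: float

@dataclass
class Rect:
    x: float; y: float; w: float; h: float

class Painter:
    def draw(self, shapes):
        lines = []
        for s in shapes:
            if isinstance(s, Circle):
                lines.append(f"circle at ({s.x},{s.y}) radius {s.r}")
            elif isinstance(s, Rect):
                lines.append(f"rect at ({s.x},{s.y}) size {s.w}x{s.h}")
        return lines

print(Painter().draw([Circle(0, 0, 1), Rect(1, 1, 2, 3)]))
```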
vkazanov
The problem is that once you exclude domain-specific hierarchy from the discussion, there's not much left of OOP.
It's just data + relevant functions. Which is ok.
That's all there is, really.
mrkeen
And Rich Hickey calls out even that last feature as a mistake, and I tend to agree.
https://gist.github.com/reborg/dc8b0c96c397a56668905e2767fd6...
lproven
Is there a script or transcript anywhere, for those of us who can read 10x faster than it is possible to understand speech?
Jach
I had whisper make a transcript and skimmed some but I ended up watching the talk at ~1.5x speed in the end anyway. https://pastebin.com/EngTq9ZA If you want the timestamps kept in I can paste that too.
em-bee
the video does a deep dive into the history of OOP which is very interesting if you are into that sort of thing.
cma
About every video on YouTube has a transcript, usually a button at the bottom of the description.
lproven
No help. It's video speed.
I can read at several thousand words a minute. So I need the whole transcript in one shot.
Then I can read it in 10 or 15 minutes or so, and decide if it's worth watching a 2 hour plus video. The answer is almost always "no".
cma
Not the closed caption button. In the bottom of the description there is "show transcript" which gives a scrollable transcript.
veggieroll
Use yt-dlp to download the transcript.
zahlman
A few lines of Javascript in the console can copy that to the clipboard for you. Maybe someone's packaged that up already. (It's on my todo list to look around...)
kristianp
I imagine the whisper transcript has fewer errors.
tw061023
It seems the community is severely overexposed to bad practices and implementations of OOP and conversely severely underexposed to the success stories.
153 comments as of time of writing, let's see.
C-F: Java: 21, C++: 31, Python: 23, C#: 2.
And yet: Pascal: 1 (!), Delphi: 0, VCL: 0, Winforms: 0, Ruby: 2 (in one comment).
This is not a serious conversation about merits of OOP or lack thereof, just like Casey's presentation is not a serious analysis - just a man venting his personal grudges.
I get that, it's completely justified - Java has a culture of horrible overengineering and C++ is, well, C++, the object model is not even the worst part of that mess. But still, it feels like there is a lack of voices of people for whom the concept works well.
People can and will write horrible atrocities in any language with any methodologies; there is at least one widely used "modern C++" ECS implementation built with STL for example (which itself speaks volumes), and there is a vast universe of completely unreadable FP-style TypeScript code out there written by people far too consumed by what they can do to stop for a second and think if they should.
I don't know why Casey chose this particular hill to die on, and I honestly don't care, but we as a community should at least be curious if there are better ways to do our jobs. Sadly, common sense seems to have given way to dogma these days.
rr808
I really don't understand his reasoning. If you have a pointer to the base class, different implementations are polymorphic and it's hidden from the caller. That is the whole point, and it means you can have an engine with a base class in a library; then different people can derive from it and use that engine.
I think his definition of OO is different to what we've got used to. Perhaps his definition needs a different name.
nickitolas
> I think his definition of OO is different to what we've got used to. Perhaps his definition needs a different name.
I've seen "OOP" used to mean different things. For example, sometimes it's said about a language, and sometimes it's unrelated to language features and simply about the "style" or design/architecture/organization of a codebase (Some people say some C codebases are "object oriented", usually because they use either vtables or function pointers, or/and because they use opaque handles).
Even when talking about "OOP as a programming language descriptor", I've seen it used to mean different things. For example, a lot of people say rust is not object-oriented. But rust lets you define data types, and lets you define methods on data types, and has a language feature to let you create a pointer+vtable construct based on what can reasonably be called an interface (A "trait" in rust). The "only" things it's lacking are either ergonomics or inheritance, or possibly a culture of OOP. So one definition of "OOP" could be "A programming language that has inheritance as a language feature". But some people disagree with that, even when using it as a descriptor of programming languages. They might think it's actually about message passing, or encapsulation, or a combination, etc etc.
And when talking about "style"/design, it can also mean different things. In the talk this post is about, the speaker mentions "compile time hierarchies of encapsulation that match the domain model". I've seen teachers in university teach OOP as a way of modelling the "real world", and say that inheritance should be a semantic "is-a" relationship. I think that's the sort of thing the talk is about. But like I mentioned above, some people disagree and think an OOP codebase does not need to be a compile time hierarchy that represents the domain model, it can be used simply as a mechanism for polymorphism or as a way of code reuse.
Anyways, what I mean to say is that I don't think arguing about the specifics of what "OOP" means in the abstract very useful, and that since in this particular piece the author took the time to explicitly call out what they mean that we should probably stick to that.
constantcrying
>I think his definition of OO is different to what we've got used to.
No. His definition is exactly what people are taught OOP is. It is what I was taught, it is what I have seen taught, it is what I see people mean when they say they are doing OOP.
> Perhaps his definition needs a different name.
No. Your definition needs a different name. Polymorphic functions are not OOP. If you showed someone standard Julia code (a language entirely built around polymorphic functions), they would tell you it is a lot of things, but nobody would call it OOP.
Importantly polymorphic functions work without class hierarchies. And calling anything without class hierarchies "OOP" is insane.
igouy
"The expression problem matrix … In object-oriented languages, it's easy to add new types but difficult to add new operations … Whereas in functional languages, it's easy to add new operations but difficult to add new types"
https://eli.thegreenplace.net/2016/the-expression-problem-an...
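The matrix in that quote can be shown in a few lines. In the OO layout, a new type is one new class but a new operation touches every class; in the functional layout it is the reverse (shape encoding and names here are illustrative):

```python
PI = 3.14159

# OO layout: operations live inside the types.
class Circle:
    def __init__(self, r): self.r = r
    def area(self): return PI * self.r ** 2
    # adding perimeter() would mean editing this class and every other shape

# Functional layout: types are plain data, operations switch over them.
def area(shape):
    kind = shape[0]
    if kind == "circle":
        return PI * shape[1] ** 2
    raise ValueError(kind)
    # adding a "square" case means editing this function and every other op

def perimeter(shape):  # a brand-new operation: no existing code touched
    kind = shape[0]
    if kind == "circle":
        return 2 * PI * shape[1]
    raise ValueError(kind)

print(area(("circle", 1.0)), Circle(1.0).area())
```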
henning
what is your reasoning? if you make your own object system, that is indeed polymorphic. do you now feel the need to model the world in your application?
nottorp
Any way to find out what the 35 year mistake was without being "engaged" for hours on that video?
chrisg23
Really short: ECS existed in the earliest implementations of OOP in 1963 and was being used in the software he showed.
When OOP went mainstream it pretty much was entirely about "compile time hierarchy of encapsulation that matches the domain model" and nothing else. His opinion is the standard way of doing OOP is a bad match for lots of software problems but became the one-size-fits-all solution as a result of ignorance.
Also he claims that history is being rewritten to some extent to say this wasn't the case and there was never a heavy emphasis on doing things that way.
isotropy
OOPs = "object-oriented programming", BUT it's a more restrained and thoughtful complaint than just "objects suck" or "inheritance sucks". He cabins it pretty clearly at 11:00 minutes in: "compile-time hierarchy of encapsulation that matches the domain model was a mistake"
ocrow
To unpack that a little, he looks to the writings of the early developers of object oriented programming and identifies the ways this assumption became established. People like Bjarne Stroustrup (developer of C++) took on and promulgated the view that the inheritance hierarchy of classes in an object oriented system can be or should be a literal instantiation of the types of objects from the domain model (e.g. different types of shapes in a drawing program).
This is a mistake because it puts the broad-scale modularization boundaries of a system in the wrong places and makes the system brittle and inflexible. A better approach is one where large scale system boundaries fall along computational capability lines, as exemplified by modern Entity Component Systems. Class hierarchies that rigidly encode domain categorizations don't make for flexible systems.
Some of the earliest writers on object encapsulation, e.g. Tony Hoare, Doug Ross, understood this, but later language creators and promoters missed some of the subtleties of their writings and left us with a poor version of object-oriented programming as the accepted default.
igouy
> This is a mistake is because
mariodiana
Is Objective-C discussed at all?
lisbbb
Inheritance sucks if you wish to write good unit tests easily. It just totally freaking sucks due to encapsulation. When you step back, you realize that composition is a far better approach to writing testable code.
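The testability point can be made concrete: with composition the collaborator is injected and can be replaced by a stub, whereas with inheritance the same logic would be locked inside a base class. A Python sketch with illustrative names:

```python
class SmtpMailer:
    def send(self, to, body):
        raise RuntimeError("would talk to a real server")

class SignupService:
    def __init__(self, mailer):      # composed, not inherited
        self.mailer = mailer
    def register(self, email):
        self.mailer.send(email, "welcome!")
        return f"registered {email}"

class StubMailer:
    def __init__(self): self.sent = []
    def send(self, to, body): self.sent.append((to, body))

# In a test, swap in the stub; no subclassing or monkeypatching needed.
stub = StubMailer()
service = SignupService(stub)
print(service.register("a@b.c"))  # registered a@b.c
print(stub.sent)                  # [('a@b.c', 'welcome!')]
```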
nine_k
In short, in my opinion:
- Encapsulation / interfaces is a good idea, a continuation of the earlier ideas of structured programming.
- Mutable state strewn uncontrollably everywhere is bad idea, even in a single-threaded case.
- Inheritance-based polymorphism is painful, both in the multiple (C++) and single (Java) inheritance cases. Composable interfaces / traits / typeclasses without overriding methods are logically cleaner and much more useful.
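The last point can be illustrated with `typing.Protocol`: an interface that types satisfy structurally, with no base class and no method overriding (class and method names here are illustrative):

```python
from typing import Protocol

class Greeter(Protocol):
    def greet(self) -> str: ...

# Neither class inherits from Greeter; they satisfy it structurally.
class English:
    def greet(self) -> str:
        return "hello"

class French:
    def greet(self) -> str:
        return "bonjour"

def greet_all(gs: "list[Greeter]") -> "list[str]":
    return [g.greet() for g in gs]

print(greet_all([English(), French()]))  # ['hello', 'bonjour']
```

This is closer to Go interfaces or Haskell typeclasses than to Java-style inheritance: the contract is decoupled from any hierarchy.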
lisbbb
Over multiple decades, I have come to reject all of it! Even interfaces.
I watched over and over again as people wrote code to interfaces, particularly due to Spring, and then none of those interfaces ever got a second implementation and were never, ever going to! It was a total waste of time. Even for testing it was almost a total waste of time, though I guess writing stubbed test classes that could pretend to return data from a queue or a database was somewhat useful. The thing is, there were easier ways to achieve that.
zaphar
Those interfaces that never got a second implementation were still defining the contract for interacting with another part of your system and that compile time enforced contract provides value. I have plenty of complaints about Spring but interfaces is not one of them.
kragen
Basically his “35-year mistake” thesis is that we almost had ECS, the entity/component/system pattern, in 01963 with Sketchpad, but it took until 01998. He explains this near the end of the talk proper, and explains how Looking Glass in 01998 introduced the pattern in Ultima Underworld II, but really introduced it with Tom Leonard’s Thief: The Dark Project. Later though he seems to be saying that he's not sure ECS is actually a good idea, but he thinks encapsulation is, if not a bad idea, at least an idea that should be applied carefully to keep it from getting in your way, and definitely not in a way that reflects a division among problem-domain objects such as cars, trucks, bridges, circular arcs, lanterns, etc.
constantcrying
The 35-year mistake was the idea that, in order to have a well-structured program, your compile-time hierarchies have to represent real-world relationships.
The talk traces that mistake to Simula, where the approach was appropriate because the language was intended to simulate real-world hierarchies. Then to C++, where it started to be used inappropriately, and then to Java, where it became universal practice to model all real-world relationships as compile-time hierarchies.
nijuashi
Stroustrup took out the object hierarchy introspection feature that was available before, which turned out to be a pretty handy feature that people kept trying to reimplement.
pjmlp
Finally coming in C++26, but boy, the syntax. C++ keeps competing with Perl in that regard, and I say this as someone who enjoys coding in C++ in my free time.
https://www.youtube.com/watch?v=wo84LFzx5nI