
Blacksmithing and Lisp

80 comments

April 3, 2025

unoti

> you can work on your problem, or you can customize the language to fit your problem better

There’s a thing I’m whispering to myself constantly as I work on software: “if I had something that would make this easy, what would it look like?”

I do this continuously, whether I’m working in C++ or Python. Although the author was talking about Lisp here, the approach should be applied to any language. Split the problem up into an abstraction that makes it look easy. Then dive in and make that abstraction, and ask yourself again what you’d need to make this level easy, and repeat.

Sometimes it takes a lot of work to make some of those parts look and be easy.

In the end, the whole thing looks easy, and your reward is someone auditing the code and saying that you work on a code base of moderate complexity and they’re not sure if you’re capable enough to do anything that isn’t simple. But that’s the way it is sometimes.

crdrost

Yes! I call this sort of top-down programming "wishful thinking." These days it is much easier to explain to people, because of machine learning tools.

“if you can just trust that ChatGPT will later fill in whatever stub functions you write, how would you write this program?” — and you can quickly get going, “well, I guess I would have a queue, while the queue is not empty I pull an item from there, look up its responsible party in LDAP, I guess I need to memoize my LDAP queries so let's @cache that LDAP stub, if that party is authorized we just log the access to our S3-document, oh yeah I need an S3-document I am building up... otherwise we log AND we add the following new events to the queue...”
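
In code, the stubbed-out version of that description might look something like this; every helper name here (lookup_owner_in_ldap, is_authorized, follow_up_events) is a made-up placeholder to be filled in later:

    # Wishful-thinking sketch: write the loop as if the stubs already existed.
    from collections import deque
    from functools import cache

    @cache
    def lookup_owner_in_ldap(item_id: str) -> str:
        raise NotImplementedError  # stub: memoized LDAP query

    def is_authorized(party: str) -> bool:
        raise NotImplementedError  # stub

    def follow_up_events(item_id: str) -> list[str]:
        raise NotImplementedError  # stub: new work items to enqueue

    def process(initial_items: list[str]) -> list[tuple[str, str]]:
        document = []                 # the S3 document being built up
        queue = deque(initial_items)
        while queue:
            item_id = queue.popleft()
            party = lookup_owner_in_ldap(item_id)
            document.append((party, item_id))        # "log the access"
            if not is_authorized(party):
                queue.extend(follow_up_events(item_id))
        return document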

It is not the technique that has most enhanced what I write (that is probably a variant on functional core, imperative shell), but it's pretty solid as a way to break the writer's block that you face in any new app.

bitwize

"Wishful thinking" is exactly what it's called in SICP. You write the code to solve your problem using the abstraction you want, then you implement the abstraction.

resize2996

I want to hear more about this functional core/imperative shell....

SatvikBeri

It's by Gary Bernhardt: https://www.destroyallsoftware.com/screencasts/catalog/funct...

He also did another talk expanding the concept called Boundaries: https://www.destroyallsoftware.com/talks/boundaries

crdrost

Satvik gave you a fine link in a sibling comment, but I'd like to add something that I call "shell services", so let me give maybe the smallest example I've got of what it looks like: a little Python lambda. First you have a core module, which only holds data structures and deterministic transforms between them:

    core/app_config.py   (data structures to configure services)
    core/events.py       (defines the core Event data structure and such)
    core/grouping.py     (parses rule files for grouping Events to send)
    core/parse_events.py (registers a bunch of parsers for events from different sources)
    core/users.py        (defines the core user data structures)
(There's also an __init__.py to mark it as a module, and so forth.) There is some subtlety: for instance, events.py contains the logic to turn an event into a Slack message string or an email, and an AppConfig contains the definition of what groups there are and whether they should send an email or a Slack message or both. But everything here is a deterministic transform. So for instance `parse_event` doesn't yet know what User to associate an event with, so `users.py` defines a `UserRef` that might be looked up to figure out more about a user, and there is a distinction between an `EventWithRef`, which contains a `list[UserRef]` of user-refs to try, and an `Event`, which contains a User.
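
Concretely, the core data structures might look roughly like this; the field names are illustrative guesses rather than the actual module contents:

    # core/users.py and core/events.py, sketched. Plain data plus
    # deterministic transforms; no I/O anywhere in core/.
    from dataclasses import dataclass
    from enum import Enum

    class UserRefType(Enum):
        EMAIL = "email"
        AMAZON_ID = "amazon_id"

    @dataclass(frozen=True)
    class UserRef:
        ref_type: UserRefType
        value: str

    @dataclass(frozen=True)
    class User:
        name: str
        email: str

    @dataclass(frozen=True)
    class EventWithRef:
        source: str
        payload: dict
        user_refs: list[UserRef]     # candidate refs to try, in order

    @dataclass(frozen=True)
    class Event:
        source: str
        payload: dict
        user: User

        def to_slack_message(self) -> str:
            # deterministic transform: pure formatting, no network calls
            return f"{self.user.name}: {self.source} event"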

Then there's the services/ module, which is for interactions with external systems. These are intentionally as bare as possible:

    services/audit_db.py      (saves events to a DB to dedupe them)
    services/config.py        (reads live config params from an AWS environment)
    services/notification.py  (sends emails and slack messages)
    services/user_lookup.py   (user queries like LDAP to look up UserRefs)
If they need to hold onto a connection, like `user_lookup` holds an LDAP connection and `audit_db` holds a database connection, then these are classes whose __init__ takes some subset of an AppConfig to configure itself. Otherwise, as with the email/Slack sends in the notification service, these are just functions which take part of the AppConfig as a parameter.
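
A sketch of those two shapes of service (sqlite3 and smtplib stand in for the real backends, and the config fields are invented):

    # services/audit_db.py and services/notification.py, roughly.
    import sqlite3
    import smtplib

    class AuditDBService:
        """Holds a connection; __init__ takes its slice of the AppConfig."""
        def __init__(self, db_config):
            self._conn = sqlite3.connect(db_config.db_path)

        def close(self) -> None:
            self._conn.close()

    # Stateless services are just functions that take part of the config.
    def send_email(smtp_config, recipient: str, body: str) -> None:
        with smtplib.SMTP(smtp_config.host) as smtp:
            smtp.sendmail(smtp_config.sender, [recipient], body)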

These functions are as simple as possible. There are a couple of audit_db functions which perform -gasp- TWO database queries, but it's for a good reason (e.g. lambdas can be running in parallel, so I want to atomically UPDATE some rows as "mine" before I SELECT them for processing notifications to send). They take core data structures as inputs and generate core data structures as outputs, and usually I've arranged for some core data structure to "perfectly match" what the service produces (Python's TypedDict is handy for this in JSON-land).
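
That reserve-then-read pair of queries might look like this (illustrative SQL and column names, with a claim token so parallel invocations don't pick up each other's rows):

    # services/audit_db.py, sketched: claim unreported rows, then read them back.
    import uuid

    def reserve_unreported_events(conn) -> list[tuple]:
        claim = str(uuid.uuid4())          # this invocation's claim token
        with conn:                         # one transaction for both statements
            conn.execute(
                "UPDATE events SET claimed_by = ? "
                "WHERE claimed_by IS NULL AND reported = 0",
                (claim,),
            )
            rows = conn.execute(
                "SELECT id, payload FROM events WHERE claimed_by = ?",
                (claim,),
            ).fetchall()
        return rows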

"Simple" can be defined approximately as "having if statements", you can say that basically all if/then logic should be moved to the functional core. This requires a bit of care because for instance a UserRef contains an enum (a UserRefType) and user_lookup will switch() on this to determine which lookup it should perform, should I ask LDAP about an email address, should I ask it about an Amazon user ID, should I ask this other non-LDAP system. I don't consider that sort of switch statement to be if/then complexity. So the rule of thumb is that the decision of what lookups to do, is made in Core code, and then actually doing one is performed from the UserLookupService.

If you grok type theory, the idea more briefly is, "you shouldn't have if/then/else here, but you CAN have try/catch and you CAN accept a sum type as your argument and handle each case of the sum type slightly differently."
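
A rough sketch of that split, reusing the illustrative UserRefType and UserRef from the core sketch above: the core makes the pure decision about which lookups to attempt, and the service only handles each case of the sum type.

    # core/users.py (continuing the earlier sketch) -- pure: decide the order
    # of lookups, no I/O, trivially unit-testable
    def lookup_plan(user_refs: list[UserRef]) -> list[UserRef]:
        return sorted(user_refs, key=lambda r: r.ref_type is not UserRefType.EMAIL)

    # services/user_lookup.py -- one case per variant, no other branching
    from core.users import User, UserRef, UserRefType

    class UserLookupService:
        def lookup(self, ref: UserRef) -> User | None:
            match ref.ref_type:
                case UserRefType.EMAIL:
                    return self._by_email(ref.value)
                case UserRefType.AMAZON_ID:
                    return self._by_amazon_id(ref.value)

        def _by_email(self, email: str) -> User | None: ...     # LDAP query stub
        def _by_amazon_id(self, uid: str) -> User | None: ...   # LDAP query stub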

Finally there's the parent structure,

    main.py        (main entrypoint for the lambda)
    migrator.py    (a quick DB migration script)
    ../sql/        (some migrations to run)
    ../test/       (some tests to run)
Here's the deal: main.py is like 100 lines long, gluing the core to the services. So if you printed it, it's only three pages of reading and then you know: "oh, this lambda gets an AppConfig from the config service, initializes some other services with that, does the database migrations, and then after all that setup is done, it proceeds in two phases. In the first ingestion phase it parses its event arguments to EventWithRefs, then looks up the list of user refs to a User and makes an Event with it; then it labels those events with their groups, checks those groups against an allowlist and drops some events accordingly, and inserts the rest into the database, skipping duplicates. Once all of that ingestion is done, phase two, reporting, starts: it reserves any unreported records in the database, groups them by their groups, and for each group tells the notification service 'here's a bunch of notifications to send'; for each successful send, we mark all of the events we were processing as reported. Last we purge any records older than our retention policy and close the database connection." You get the story in broad overview in three pages of readable code.

migrator.py adds about two printed pages more to do database migrations; in its current form it makes its own DB connections from strings, so it doesn't depend on core/ or services/. It's kind of an "init container" app, except AWS Lambda isn't containerized in that way.
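
Compressed to its skeleton, that glue reads something like this (all of the function and service names here are invented to match the description, not the real code):

    # main.py, sketched
    def handler(lambda_event, _context):
        config   = config_from_secrets_manager()
        users    = UserLookupService(config.ldap)
        audit_db = AuditDBService(config.db)

        # Phase 1: ingestion (parse -> resolve users -> group -> filter -> insert)
        events = [resolve_user(e, users) for e in parse_events(lambda_event)]
        events = drop_allowlisted(label_groups(events, config.groups))
        audit_db.insert_skipping_duplicates(events)

        # Phase 2: reporting (reserve -> group -> notify -> mark reported)
        for group, batch in by_group(audit_db.reserve_unreported()):
            if send_notifications(config, group, batch):
                audit_db.mark_reported(batch)

        audit_db.purge_older_than(config.retention)
        audit_db.close()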

The test folder is maybe the most important part, because based on this decoupling,

- The little pieces of logic that haven't been moved out of main.py yet can be tested by mocking. This can be reduced arbitrarily much -- in theory there is no reason that a Functional Core, Imperative Shell program needs mocks. (Without mocks, the assurance that main.py works is that main.py looks like it works, worked previously, and hasn't changed; it's pure glue and high-level architecture. If it does need to change, the assurance that it works is that it was deployed to dev and worked fine there, so the overall architecture should be OK.)

- The DB migrations can be tested locally by spinning up a DB with some example data in it, and running migrations on it.

- The core folder can be tested exhaustively by local unit tests. This is why it's all deterministic transforms between data structures -- that's actually, if you like, what mocking is: an attempt to take nondeterministic code and make it deterministic. The functional core is where all the business logic is, and because it's all deterministic it can all be tested without mocking.

- The services can be tested pretty well by nonlocal unit "smoke"/"integration" tests, which just connect and verify that "if you send X and parse the response to data structure Y, no exception gets thrown and Y has some properties we expect, etc." This doesn't fully test the situations where the external libraries called by services throw exceptions that aren't caught. So you can easily test "remote exists" and "remote doesn't exist", but "remote stops existing halfway through" is untested and "remote times out" is tricky.

- The choice of what to test in services depends a lot on who has control over it. AuditDBService is always tested against another local DB in a Docker container with test data preloaded, because we control the schema, we control the data, and it's just a hotspot for devs to modify. config.py's `def config_from_secrets_manager()` is always run against the AWS Secrets Manager in dev. UserLookupService is always tested against live LDAP because that's on the VPN and we have easy access to it. But NotificationService, while it probably should get some sort of API token and send to the real Slack API, we haven't put in the infrastructure for that and created a test Slack channel or whatever... so it's basically untested (we mock the HTTP requests library, I think?). But it's also something that basically nobody has ever had to change.

Once you see that you can just exhaustively test everything in core/ it becomes really addictive to structure everything this way. "What's the least amount of stuff I can put into this shell service, how can I move all of its decisions to the functional core, oh crap do I need a generic type to hold either a User or list[UserRef] possible lookups?" etc.
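
For example, the core tests need no mocks at all, because everything under test is a deterministic transform; reusing the names from the sketches above, a hypothetical test file might be just:

    # test/test_core.py, sketched: no mocks, no network, no database.
    from core.users import User, UserRef, UserRefType, lookup_plan
    from core.events import Event

    def test_email_refs_are_tried_first():
        refs = [UserRef(UserRefType.AMAZON_ID, "123"),
                UserRef(UserRefType.EMAIL, "a@example.com")]
        assert lookup_plan(refs)[0].ref_type is UserRefType.EMAIL

    def test_slack_message_mentions_the_user():
        event = Event(source="audit", payload={},
                      user=User(name="Ada", email="ada@example.com"))
        assert "Ada" in event.to_slack_message()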


bch

“The reasonable man adapts himself to the world: the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”

George Bernard Shaw, Man and Superman

zellyn

I think Casey Muratori calls this "compression". And yeah (see other child comment), he does it in C++ :-)

robocat

C++ is for when complexity cost is worth trading for performance gain. What type of person successfully finds simplicity working in C++?

unoti

> C++ is for when complexity cost is worth trading for performance gain. What type of person successfully finds simplicity working in C++?

> What type of person successfully finds simplicity working in C++?

Be the change you want to see!

Every language has the raw materials available to turn the codebase into an inscrutable complex mess, C++ more than others. But it’s still possible to make it make sense with a principled approach.

pjmlp

Those of us who grew up with the language always used it instead of C when given the option; thus, even if no one can claim expertise, we know it well enough to find it relatively simple to work with.

In the same vein, Python looks simple on the surface, but in reality it is a quite deep language once people move beyond using it as a DSL for C/Fortran libraries or in introduction-to-programming scenarios.

WillAdams

Creating a Domain Specific Language (DSL) for a given task is a classic approach to solving problems.

Animats

The classic issue of who made the first tongs could be inserted here, with some hammering.

(It's a classic legend. There is an Islamic legend that Allah gave the first pair of tongs to the first blacksmith because you need a pair of tongs to make a pair of tongs. There's a Nordic legend that Thor made the first tongs. In reality, somebody probably used a bent piece of green wood, which didn't last long, but could be easily replaced.)

His piece "Vibe Coding, Final Word"[1] is relevant right now.

[1] https://funcall.blogspot.com/2025/04/vibe-coding-final-word....

PaulRobinson

Whitworth [0] showed that you can make a more precise tool than the one you use to make it. This means you "evolve" towards tongs, or screws, or high-precision calipers, or anything else you want to make, if you use the right process.

[0] https://en.wikipedia.org/wiki/Joseph_Whitworth

WillAdams

Simon Winchester did a book on this:

_The Perfectionists: How Precision Engineers Created the Modern World_

(alternately titled _Exactly_)

https://www.goodreads.com/work/editions/56364115-the-perfect...

and for further technical details see:

_Foundations of Mechanical Accuracy_ by Wayne R. Moore

https://mitpress.mit.edu/9780262130806/foundations-of-mechan...

bsder

Please don't recommend "Foundations of Mechanical Accuracy" without also providing a source. The prices people want for it are absurd if it is even available.

pmarreck

That's not remotely "vibe coding" though. Vibe coding would be like using Claude Code or Codeium Windsurf with a recent model. Something that does the code edits for you and optionally lets you code-review them first to approve/deny. Not copy-pasting GPT4o-produced bupkis.

djaouen

Have you considered the possibility that this would have made things worse?

pmarreck

If you don’t actually take the opportunity to review the code, I’ve found that it almost certainly does.

Don’t “vibe code” while you’re sleepy, for example

gsf_emergency

According to "official" legend it was the brothers Brokkr & Eitri who made Mjolnir, though I couldn't find anything about the tongs.

https://en.wikipedia.org/wiki/Brokkr

Re: "funcall's vibe coding findings", it makes sense that human-style lisp (/tongs) would be too nonlinear for LLMs (or gods like Thor) to generate?

Edit: but in line with latter-day retcons it also makes sense that Thor would get credit for something good that Loki did

shirleyquirk

It doesn’t make any sense that you’d need tongs to make tongs; just hold the workpiece. Maybe you can't draw out the reins quite so much on your first one. (OK, I'm a modern blacksmith who assumes the existence of rolled bar as a source material.)

But a hammer! How do you make a hammer without a hammer?

WillAdams

Find a chunk of raw metal (possibly meteoric iron, more likely copper) of a suitable size/shape, find a tree, using a sharp rock, saw off a suitable branch, split it open, insert the metal chunk, using vines or the intestines of a small animal secure it in place --- if desired, allow the tree to grow around the inclusion for a couple of years, then use a sharp rock to saw off the branch at a suitable length.

kazinator

The truth is that when you tap softened tongs around a workpiece into shape, they turn into parentheses. That's what reminds you of Lisp, not the malleability explanation that you invented afterward.

Lisp, Jazz, Aikido and (now) Blacksmithing.

gsf_emergency

More generous & valuable comment from reddit

The distinction between Lisp and the programming languages widely adopted in the industry is a bit like the distinction between artist blacksmiths and fabricators. Blacksmiths have the skills and technique to transform the form of the metal materials they work with, while fabricators essentially rely upon the two operations of cutting and welding. Blacksmiths will use those two operations in their work, but also have the more plastic techniques of splitting, drifting, upsetting, fullering, etc.

https://old.reddit.com/r/lisp/comments/1eu9gd9/comment/likzw...

These additional basic tools are created from essentially the same working material, on the fly, just like the tongs in TFA

eikenberry

This is a great analogy, particularly with one addition: the two operations vary between fabricators so that, ideally, you have the two operations that work best for your industry. That is the same difference between Lisp-like languages and industrial languages: the former allow you to build any domain language while the latter are already-built domain languages. That is, when using Lisp you work like a sculptor: you build your language by removing expressiveness until you can only express your domain. Industrial languages have already removed the expressiveness and are adopted by people who find them useful for their working domain. The main difference is that the latter are more generalized to a category of domains vs. one particular domain. I think this is one of the key reasons they 'won' over more expressive languages like Lisp: they created a better common ground for related projects to collaborate on, and collaboration is more important than domain expressiveness.

asa400

From this comment it follows that for “industrial” software, having less power actually allows for a greater degree of composition at a higher level. Whether “more power” is advantageous is contextually dependent. Having only cutting and welding at your disposal is, as a designer, somewhat freeing.

gsf_emergency

Are you accusing artist blacksmiths of spending too much time thinking about/admiring their ad hoc tools?

Cf Whitehead

Civilization advances by extending the number of important operations which we can perform without thinking of them.


pfdietz

This comment reminded me of a Youtube channel I watch. The episode I was just watching had Kurtis making flogging spanners (wrenches intended to be used with a hammer) out of steel plate. Draw the outline, cut with a torch, smooth the edges with a grinder, done.

FredPret

It'd be interesting if we could draw up a family tree of tool fabrication for any object.

The root object would be two rocks brought together in a bang heard 'round the world, then perhaps some sharpened sticks, all the way up to a Colchester lathe somewhere in Victorian England and the machinery that made whatever object we're looking at.

WillAdams

For the machine shop there are the "Gingery" books:

https://gingerybookstore.com/

which is a multi-volume series based on the fact that a lathe is the only tool in a machine shop which can replicate itself, so the first volume has one make an aluminum casting foundry in one's backyard, the second how to use it to make a lathe, then one can use the rough-cast lathe to improve itself (or make a better lathe), and from there make the entirety of a machine shop.

Blacksmithing as noted in the original article is unique in that it is self-sufficient in a way that few other processes are, and downright elemental in what one needs.

Another book which touches on this sort of things is Verne's _The Mysterious Island_ which has a couple of escaped Civil War prisoners making things from essentially first principles:

https://www.gutenberg.org/ebooks/1268

Less on the nose are _Robinson Crusoe_ and _The Swiss Family Robinson_, though those have starter kits in the form of flotsam and jetsam (rather more than that for the latter).

wilsonjholmes

appropedia.org has a lot of instructions on how to build things. I don't think a "progression tree" exists, but it is a place to start!

hyperbrainer

Any sufficiently complicated piece of code contains an ad-hoc implementation of Lisp.

MortyWaves

I just don’t get the analogy he’s trying to draw here

Jtsummers

Lisp is a tool meant to be molded to the work, not to have the work forced into a broader, less expressive Lisp mold. Tongs are tools that can be molded to fit the work, rather than forcing the work to fit the tongs (or awkwardly, and perhaps disastrously, not fitting the tongs).

That's it, not too complex.

hdkdicnsnjes

Imagine the software industry if lisp was mainstream.

throwawaylsp

There's one thing I've never understood. Lisp is 65 years old. It's older than any mainstream programming language apart from FORTRAN. It has a bevy of vocal fans in every generation. So...why hasn't it gone mainstream? Or at least, why has it failed to remain there?

kazinator

Lisp became incredibly popular at one point. It spawned so much productivity that it created huge programs that used lots of memory and demanded expensive hardware.

This peaked at a time when microcomputers had not reached the right affordability and power parameters.

People who were developing in Lisp turned their eyes to the microcomputer market and the business to be had there, if the stuff would only run. So there was some activity of rewriting Lisp stuff in languages like Bliss and C.

The transition from powerful workstations (where we can count Lisp machines) to microcomputers basically destroyed everything which couldn't make the jump nimbly.

The new crop of programmers who cut their teeth on micros simply had no knowledge or experience with anything that didn't run on micros.

Poof, just like that, a chunk of the computing sphere consisting of new people suddenly had amnesia about Cobol, Fortran, Snobol, PL/I, operating systems like TOPS/20 and VMS and whatnot.

Only Unix pulled through, pretty much --- and that's because Unix started on relatively weak hardware and was kept small. Unix started getting complicated approximately in step with micro hardware getting more complicated and powerful. E.g. a Unix kernel was around 50 kilobytes in 1980: not a good fit for some Apple II or Commodore PET, but not far off from the resources the IBM PC would have.

By the time micros were powerful enough to run the huge Lisp stuff with 20-megabyte images, we were into the 90s, and starting to be overrun with crap dynamic languages.

Now Lisp people could have buckled down and worked on promoting excellent Lisps for microcomputers. There were a few fledgling efforts like that that were not promoted well.

It seems that the Lisp programmers who were around were mostly wrapped up working on bigger problems on larger hardware, and ignored microcomputers.

It's very hard to promote anything today that most of your Generation X (now management class) didn't get to play with in the 80s and 90s.

dreamcompiler

Related question: Why is welding pretty mainstream while blacksmithing is a much more niche craft? Blacksmithing is a more overarching skill: After all every blacksmith knows how to weld but relatively few welders can forge effectively.

Possible answers:

1. Blacksmiths enjoy making custom tools for each domain while welders just want to get on with solving their domain problem.

2. Blacksmithing is harder to learn. Welding using modern techniques is easy to learn. (Caveat: Welding well is quite difficult. But learning to weld good enough to repair a broken hitch on your tractor is easy.)

3. Welding can solve a very large chunk of metalwork problems. Not all of them--and not always with elegance--but it gets the job done quickly. Blacksmithing can solve a larger set of metalwork problems with more elegance but it also takes more time and skill.

kazinator

Because with welding, you can build useful frame structures out of tubes and rods.

Would you ride a bike frame forged by a blacksmith? Haha.

convolvatron

You can very reasonably do welding in your garage. Aside from a welder (as little as $150 for a barely-usable MIG), all you need is an angle grinder to cut and finish the welds. Commercially, you can get a mid-range MIG and a couple more smallish tools and you can start selling custom fencework and mounting brackets and such.

For blacksmithing you need a forge, which immediately takes up more space and is somewhat more likely to start a fire, plus an anvil, and tongs, and hammers. It's also a lot more physically demanding, even if you use a power hammer.

Your #2 and #3 are pretty key. Most importantly, most fabrication jobs are much happier to get quick work with reasonable precision using stock shapes. Once you start talking about real free-form hot shaping, you're immediately going up at least 10x in price/time. Welded table base: $500. Handcrafted wrought table base: $10,000.

Really it's that metalwork is mostly functional (fences, stairs, railings, walkways, enclosures, stainless for commercial kitchens, pipefitting, etc.). It's very difficult to stay in business as an actual craftsman making well-designed objects. Architectural metal is probably the easiest way in (wall coverings, nice-looking railings and stairs, lamps, and other decorative elements), and there it's still dominated by fabrication processes (machining and welding of stock shapes), although nicer materials like bronze start to have their place.

Edit: you know, I left this thinking I was missing something, and I realized what it is. With welding you make shapes out of like-shapes, like making drawings in Figma. I don't think a lot of people have what it takes to learn to be a really good freehand artist. And even if you have the skill, being able to design those kinds of organic, arbitrary shapes so that they are emotive and attractive is another step up. Do you want a piece of art which is a direct expression of the concept held by the artist? Or do you want a 3x5', 32"-high workbench for 1/20 the cost?

jcranmer

Disclaimer: this mostly happened before, or at best shortly after, I was born, so this isn't drawn from personal recollection but rather attempting to synthesize from others' recollections, often from people who have some bias.

One of the major trends in computing in the 80's and 90's is that high-end systems lost out to the growth in capabilities of low-end systems, and this happens in pretty much every level in the computing stack. Several people responded to this trend by writing articles sniffling that their high-end systems lost to mass market garbage, often by focusing on the garbage of the mass market garbage and conveniently avoiding analysis as to why the high-end systems failed to be competitive in the mass market. The wonders of Lisp is one of the major topics of this genre.

Most famously, Lisp was tarred by its association with AI during the concomitant collapse of AI that led to the AI Winter, though it's less often explored why AI failed. In short, it didn't work. But more than just AI at the time, people also felt that the future of programming in general was based around the concept of something like rules-based systems: you have a set of rules that correspond to all of the necessary business logic, and a framework of program logic that's making those rules actually take effect--you can see how a language like Lisp works very well in such a world. But programming doesn't have a clean separation between business logic and program logic in practice, and attempts to make that separation cleaner have largely failed.

So Lisp has a strong competitive advantage in a feature that hasn't proven to actually be compelling (separating business from program logic). Outside of that feature, most of its other features are rather less unique and have seeped into most mainstream programming languages. Functional paradigms, REPLs, smart debuggers, garbage collection--these are all pretty widespread nowadays. Where Lisp had good ideas, they've been extensively borrowed. Where those ideas haven't pulled their weight... they've languished, and most of the people wistfully wishing for a return to Lisp haven't acknowledged the limitations of these features.

hcarvalhoalves

The language was married to and sold with a hardware architecture that didn’t achieve massive commercial success compared to the other workstations at the time and later microcomputers.

kazinator

And it pretty much had to be married to that because off-the-shelf microcomputers weren't powerful enough for it.

Jtsummers

It very nearly did. Then the AI Winter happened.

https://en.wikipedia.org/wiki/AI_winter

vindarel

But where is it now? If not mainstream, where? Is it not used at all, or only by hobbyists, or also by successful companies, today? If it isn't mainstream, does that matter? And if not, where do we draw the line?

Some elements, so we don't judge in a void: https://github.com/azzamsa/awesome-lisp-companies/ (some are hiring) (that's just the companies we know, nothing official)

https://github.com/CodyReichert/awesome-cl/

stzsch

kazinator

That's just a mind projection piece written by someone with next to no Lisp experience, let alone in a setting with multiple developers.

dreamcompiler

"Lisp is so powerful that problems which are technical issues in other programming languages are social issues in Lisp."

So true. Lisp was designed to give individual programmers tremendous power. That means Lisp programmers sometimes prefer to reinvent solutions to problems rather than learn to use some existing solution. This tendency can be an absolute nightmare on a software engineering team.

Not that using Lisp on a software engineering team cannot be done, but it requires very strong discipline and leadership. The absence of strong discipline and leadership on a Lisp SWE team can lead to enormous amounts of wheel reinvention and technical debt.

Obviously discipline and leadership are necessary for any SWE team but languages like C don't encourage reinvention nearly as much as Lisp does, and Lisp programmers in general tend to be very resistant to the imposed discipline that SWE requires. (I say this as a diehard Lisp programmer, so I'm talking about myself.)

linguae

Programming language adoption is more than just about syntax and semantics; there are other factors. For example, JavaScript is often criticized for its design, yet this hasn’t stopped tens of millions of developers from learning the language, since if you want to do client-side Web programming (the most widely deployed platform in the world), you need to use JavaScript, period. It also helps if a language has/had a major corporate backer at a crucial time in its life. Java has Sun/Oracle, C# has Microsoft, Go has Google, and C and C++ had AT&T (Bell Labs).

Lisp’s most successful commercial period was during the 1980s during an AI boom. Companies such as Symbolics, Texas Instruments, and Xerox sold workstations known as Lisp machines that were architecturally designed for running Lisp programs. They had corporate and institutional customers who were interested in AI applications developed under Lisp, including the United States government. Lisp was also standardized during this time period (Common Lisp). Lisp even caught the attention of Apple; Apple had some interesting Lisp and Lisp-related projects during its “interregnum” period when Steve Jobs was absent, most notably Macintosh Common Lisp, the original Newton OS (before C++ advocates won approval from CEO John Sculley), Dylan, and SK8.

However, the AI Winter of the late 1980s and early 1990s, combined with advances in the Unix workstation market where cheaper Sun and DEC machines were outperforming expensive Lisp machines at Lisp programs, severely hurt Lisp in the marketplace. AI would boom again in the 2010s, but this current AI boom is based not on the symbolic AI that Lisp excelled at, but on machine learning, which relies on numerical computing libraries that have C, C++, and even Fortran implementations and Python wrappers. Apple in the 1990s could have been a leading advocate of Lisp for desktop computing, but Apple was an unfocused beacon of creativity: many interesting projects, but no solid execution for replacing the classic Mac OS with an OS that could fully meet the demands of 1990s and 2000s computing. It took the purchase of NeXT to make this happen, and under Steve Jobs’ leadership Apple was a focused beacon of creativity with sharp execution. Of course, we ended up with Smalltalk-inspired Objective-C, not Common Lisp or Dylan, as Apple’s official language before Swift was released after the end of Jobs’ second reign.

Some other factors: 1. Lisp was truly unique in the 60s, 70s, and 80s, but it required expensive hardware to run. It would be hard to conceive of a Lisp running well on a 6502 or an 8086. Something like my NeXT Cube with a 68040 would do a much better job, but those machines cost roughly $6500 in 1989 dollars, out of reach for many developers.

2. By the time hardware capable of running Lisp acceptably became affordable, other languages started offering certain features that used to be unique to Lisp. Wanted garbage collection? In 1995 Java became available. Want object-oriented programming? You didn’t even have to wait until 1995 for that due to C++. Want anonymous functions and map()? Python’s popularity took off in the 2000s. Yes, Lisp still offers features that are not easily found in other languages (such as extensive metaprogramming), but the gap between Lisp and competing popular languages has been narrowing with each successive decade.

zozbot234

> Lisp was truly unique in the 60s, 70s, and 80s, but it required expensive hardware to run. It would be hard to conceive of a Lisp running well on a 6502 or an 8086.

You'd be surprised. https://retrocomputing.stackexchange.com/questions/11192/wha... Of course something like FORTH was perhaps more suited to these smaller machines, but LISP implementations were around. Many users of 6502-based microcomputers were familiar with LOGO, which is just a LISP with different syntax.

throwawaylsp

Thanks, that was very interesting and informative!

owebmaster

Microsoft/Google would push a TypeLisp with Java DX

jimbob45

Heavyweight support for corporate use cases is exactly what Lisp is missing right now. I would love for MS to pump out a Visual Scheme or TypeLisp. It's the perfect scripting language for embedding in CLR managed code, rather than bringing in something massive like C#.

Alas, I think MS saw the failure of Clojure within the Java ecosystem and foresaw the same if they made a similar effort.

roxolotl

What would TypeLisp or Visual Scheme provide that you can't get from a REPL and a language server integrated into your editor?

At work I write a lot of TypeScript. At home I write a lot of Lisp. The Lisp is absolutely more ergonomic and extensible.

asa400

Why do you consider Clojure a failure in the Java ecosystem?

hdkdicnsnjes

Ah shit, right, it’s probably for the best lisp isn’t mainstream.

MrMcCall

Parentheses wouldn't be shift-9 and -0.

NikkiA

"If C was mainstream braces wouldn't be shift-[ and shift-], clearly LOGO must be the dominant language"

MrMcCall

I second that motion, good Sir!

attila-lendvai

I very rarely type parens while coding in Lisp.

It's a tree. It's just a few operations to transform it as a structure.
