Haskelling My Python
72 comments
April 18, 2025 · brianberns
abeppu
Huh, there must have been something in the water leading up to this. Also from 1998 is this paper, "Calculus in coinductive form" and neither of these cites the other. https://ieeexplore.ieee.org/document/705675
brianberns
These are indeed very similar. Thanks for the link!
The math is a bit over my head, but this formulation seems more difficult than the one I'm familiar with. For example, x^2 is represented as 0::0::2 instead of 0::0::1 (because 2! = 2) and x^3 is represented as 0::0::0::6 instead of 0::0::0::1 (because 3! = 6). Is there a benefit to that?
barrenko
I was introduced to the notion of power series two weeks ago, and now it's seemingly everywhere...
angra_mainyu
Power series are possibly one of the most powerful tools in analysis.
notpushkin
Absolutely unrelated, but there’s a Haskell-like syntax for Python: https://web.archive.org/web/20241205024857/https://pyos.gith...
    f = x -> raise if
      x :: int  => IndexError x
      otherwise => ValueError x

Complete with pipelines, of course:

    "> {}: {}".format "Author" "stop using stale memes"
        |> print
agumonkey
not too far from that there's https://coconut-lang.org/
sizeofchar
Wow, this is so awesome! A shame it didn’t progress.
nexo-v1
I really like this idea too. Generators are one of my favorite parts of Python — super memory efficient, and great for chaining transformations. But in practice, I’ve found they can get hard to reason about, especially when you defer evaluation too much. Debugging gets tricky because you can’t easily inspect intermediate states.
When working with other engineers, I’ve learned to be careful: sometimes it’s better to just materialize things into a list for clarity, even if it’s less “elegant” on paper.
There’s a real balance between cleverness and maintainability here.
pletnes
There’s some stuff in itertools to cut sequences into batches. Could be a useful intermediate step - grab 100 things at a time and write functions that receive and emit lists, rather than generators all the way down.
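A minimal sketch of that intermediate style. `batched_lists` here is a hand-rolled stand-in for `itertools.batched` (which ships in Python 3.12+); each chunk is a concrete list you can inspect or hand to list-based functions:

```python
from itertools import islice

def batched_lists(gen, size):
    """Yield lists of up to `size` items from a generator -- a hypothetical
    helper standing in for itertools.batched (Python 3.12+)."""
    while chunk := list(islice(gen, size)):
        yield chunk

# 250 items cut into chunks of 100: two full batches plus a remainder
squares = (n * n for n in range(250))
batches = list(batched_lists(squares, 100))
print([len(b) for b in batches])  # [100, 100, 50]
```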
UltraSane
I often will intentionally store intermediate values in variables rather than just doing a clever one liner in Python specifically because I know it will make debugging easier.
ankitml
It is possible to test the chaining though, if you know your data well. If not, those edge cases in the data quality can throw things off balance very easily.
cartoffal
> The $ operator is nothing but syntactic sugar that allows you to write bar $ foo data instead of having to write bar (foo data). That’s it.
Actually, it's even simpler than that: the $ operator is nothing but a function that applies its left argument to its right one! The full definition is
    f $ x = f x

(plus a directive that sets its precedence and associativity)
IshKebab
This kind of thing is emblematic of how little Haskellers care about readability and discoverability. I'm sure it's great for people that are already expert Haskellers but it adds yet another thing to learn (that's really hard to search for!) and the benefit is... you can skip a couple of brackets. Awesome.
gizmo686
Skipping brackets is incredibly useful for readability. Basically every language that relies on brackets has settled on a style convention that makes them redundant with indentation. Admittedly, the big exception to this is function application. But Haskell is much more function oriented, so requiring brackets on function application is much more burdensome than in most languages.
As to searchability, this should be covered in whatever learn Haskell material you are using. And if it isn't, then you can literally just search for it in the Haskell search engine [0].
[0] https://hoogle.haskell.org/?hoogle=%24&scope=set%3Astackage
sabellito
I agree with this sentiment. The one thing I liked about Clojure was the simplicity of the syntax, as long as you kept away from writing macros.
In general, after so many decades programming, I've come to dislike languages with optional periods (a.foo) and optional parentheses for function calls: little gain in exchange for confusion about precedence and about what's a field vs. a method. Seems the whole DSL craze of 15 years ago was a mistake after all.
Having said all that, I think haskell is awesome, in the original sense of the word. I became a better programmer after working with it for a bit.
rowanG077
Haskell has its issues, but this really ain't it. $ is idiomatic, used all over the place, and greatly more readable than stacking up brackets. The discoverability is also great because, like most things in Haskell, you can literally just input it into Hoogle: https://hoogle.haskell.org/?hoogle=%24 and the first hit is, of course, the definition of it, which includes a full explanation of what it does.
mrkeen
Excessive brackets are an evergreen complaint against lisp.
Here's one that made the front page, same time as your comment: https://news.ycombinator.com/item?id=43753381
yoyohello13
You can type `:doc $` in ghci and it will tell you how the function is defined and give you usage examples.
gymbeaux
We aren’t enthusiastic about having to do that either
pklausler
In Haskell, `($) = id`, because `id f x` = `(id f) x` = `f x`.
mark_l_watson
Well, definitely very cool, but: the Haskell code is delightfully readable while the Python code takes effort for me to read. This is not meant as a criticism, this article is a neat thought experiment.
abirch
I agree that this was a great article.
For me, this isn't intuitive. It works; however, it doesn't scream recursion to me.
    def ints():
        yield 1
        yield from map(lambda x: x + 1, ints())

I preferred:

    def ints():
        cnt = 1
        while True:
            yield cnt
            cnt += 1
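A quick sanity check (a sketch; the function names are renamed here to keep both versions in one file) that the two definitions agree on their first few values:

```python
from itertools import islice

def ints_recursive():
    # yields 1, then 1 + each value of a fresh copy of itself
    yield 1
    yield from map(lambda x: x + 1, ints_recursive())

def ints_loop():
    # the plain stateful counter
    cnt = 1
    while True:
        yield cnt
        cnt += 1

first = list(islice(ints_recursive(), 5))
second = list(islice(ints_loop(), 5))
print(first, second)  # [1, 2, 3, 4, 5] [1, 2, 3, 4, 5]
```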
thyrsus
I'm a Python newbie, so please correct the following: The first function looks quadratic in time and linear in stack space, while the second function looks linear in time and constant in stack space. Memoizing would convert the first function to linear in time and linear in space (on the heap instead of the stack). For Python, wouldn't I always use the second definition? For Haskell, I would use the [1..] syntax, which the compiler would turn into constant-space, linear-time machine code.
trealira
Late response, but yes, you are completely right. You wouldn't use either implementation in Python or Haskell; you'd use what you said, because that's both the most obvious and the most performant way to achieve the goal. It's just a fun exercise to show that the version using a lazy map is equivalent to the obvious thing. Some people find it mind-bending and satisfying in the same way some people might find bit-twiddling hacks cool, or math nerds might find any of these mathematical identities cool.
https://math.stackexchange.com/questions/505367/collection-o...
whalesalad
Generators are one of my favorite features of Python when used in this way. You can assemble complex transformation pipelines that don’t do any actual work and save it all for one single materialization.
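A minimal sketch of such a pipeline: each stage is a generator expression, so constructing the chain does no work at all, and nothing runs until the final materialization.

```python
# each step is a lazy generator expression: building the pipeline is free
data = range(1_000_000)
evens = (n for n in data if n % 2 == 0)
squared = (n * n for n in evens)
small = (n for n in squared if n < 100)

# only this final list() actually pulls data through the whole chain
result = list(small)
print(result)  # [0, 4, 16, 36, 64]
```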
louthy
I've never written a line of python in my life, so I'm interested in how this "recursive function" can do anything different on each recursive call if it takes no arguments?
    def ints():
        yield 1
        yield from map(lambda x: x + 1, ints())

Surely it would always yield a stream of `1`s? Seems very weird to my brain. "As simple as that" it is not!
lalaithion
The first item yielded from ints() is 1.
For the second item, we grab the first item from ints(), and then apply the map operation, and 1+1 is 2.
For the third item, we grab the second item from ints(), and then apply the map operation, and 1+2 is 3.
louthy
Got it, thanks! The syntax was throwing me a bit there.
lalaithion
It’s a pretty bad system since it takes O(n^2) time to produce n integers but ¯\_(ツ)_/¯. Haskell avoids the extra cost by immutable-self-reference instead of creating a new generator each time.
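Haskell's `ints = 1 : map (+1) ints` shares a single lazily-built list, so each element is computed once. In Python, the linear-time equivalent of this infinite stream is simply a single stateful counter, e.g. `itertools.count`:

```python
from itertools import count, islice

# The recursive generator spawns a fresh chain per element, so producing
# the n-th value redoes all the earlier work: O(n^2) overall. One shared
# counter does the same job in O(n) total.
naturals = count(1)
print(list(islice(naturals, 5)))  # [1, 2, 3, 4, 5]
```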
rowanG077
Sure, it yields 1. But then it adds one to each value yielded from the recursive call. And repeat.
sanderjd
This is certainly neat. This isn't a criticism, but I think more like an expansion on the author's point:
The reason this all works is that generators plus memoization is "just" an implementation of the lazy sequences that Haskell has built in.
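A rough sketch of that idea (hypothetical code, not the article's actual decorator): memoize a generator's output in a shared list, so every consumer sees the same lazily-computed sequence with no recomputation, much like a Haskell lazy list.

```python
from itertools import islice

def cached_stream(make_gen):
    """Share one underlying generator and memoize its output so repeated
    traversals never recompute a value. A hypothetical sketch of
    'generators plus memoization = lazy sequences'."""
    memo = []
    source = make_gen()
    def stream():
        i = 0
        while True:
            if i == len(memo):
                memo.append(next(source))  # compute on first demand only
            yield memo[i]
            i += 1
    return stream

calls = 0

@cached_stream
def nats():
    global calls
    n = 1
    while True:
        calls += 1  # count how often the underlying generator runs
        yield n
        n += 1

print(list(islice(nats(), 5)))  # [1, 2, 3, 4, 5]
print(list(islice(nats(), 5)))  # same prefix, served from the memo
print(calls)  # 5: each value was computed exactly once
```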
nurettin
Excited to read their next blog where they discover functools.partial and return self.
nickpsecurity
I just discovered one from your comment. Thank you!
benrutter
I like this! Tiny question, is the cache at the end any different from the inbuilt functools cache?
BiteCode_dev
Or, you know, use numpy.
shash
Yeah, that’s much more efficient.
But there’s something beautiful in the way that a Taylor expansion or a trigonometric identity emerge from the function definition. Also, it teaches an interesting concept in lazy evaluation.
I mean, why not write straight up assembler? That would be even more efficient…
This idea comes from a functional pearl called "Power Series, Power Serious" [0], which is well worth reading.
I implemented the same thing myself in F#. [1]
[0]: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&d...
[1]: https://github.com/brianberns/PowerSeries