Vine: A programming language based on Interaction Nets

lewdwig

It starts off looking like a slightly modified Rust until about halfway through, where it gets real weird. It’s burying the lede… why not open with what interaction nets are and why you’re building them into a programming language?

Made my head hurt a bit; it’s odd that it doesn’t really attempt to contextualise whether the weirdness is for its own sake or serves some greater practical purpose.

Always fun to find a novel way to think about programming though.

Neywiny

Yeah, I don't know enough Rust to know what's special here. Presumably the inverse/backwards part. I maintain that every project (language, application, etc.) should start with what it is, what it looks like (code or screenshot as appropriate), and why it matters. This felt very implicit here.

And I'm still not sure how that wacky stuff gets compiled down. As a language, it looks like syntax sugar.

zozbot234

I think the "backwards in time" aspect can most easily be understood as a consequence of (A -> B) being the same as (~B -> ~A). That is, if you know how to go from A to B, then instead of requesting B it's enough to request A. Though this can be applied more generally to a "logic of resources" which then also requires multiple varieties of the "struct" and "enum" construct, depending on whether the "producer" or "requester" side is making a choice.

Neywiny

I guess I'm just not understanding the minimum subtraction example. They claim it only loops through the array once, and I see that in the code, but I don't see how that's possible. In Python land I can certainly do [x - min(v) for x in v] and maybe that looks like I'm only going once but really I'm not. Maybe I'll fire up the compiler and check the resulting code.
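For what it's worth, the Python version as written is even worse than two passes, since `min(v)` sits inside the comprehension and gets re-evaluated for every element. A quick sketch of the distinction:

```python
v = [9, 4, 12, 2]

# min(v) is re-evaluated for every element here, so this is
# really n+1 traversals of the list, not one.
naive = [x - min(v) for x in v]

# Hoisting the min makes the two passes explicit:
# one loop to find the minimum, one loop to subtract it.
m = min(v)
two_pass = [x - m for x in v]

print(naive, two_pass)  # [7, 2, 10, 0] [7, 2, 10, 0]
```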

I guess I'm still in the edit window. I installed it. Naturally, one of its hundreds of dependencies needed a newer rustc than the 22.04 PPA provides, but whatever, got that sorted. Turns out it doesn't compile down into assembly, but if I'd kept reading I would've seen that it goes to Ivy, an intermediate language that's run later. Running the example gives:

    $ vine run vine/examples/sub_min.vi
    [1, 0, 4, 6]

    Interactions  Total 948  Annihilate 491  Commute 21  Copy 88  Erase 102  Expand 142  Call 63  Branch 41
    Memory        Heap 1_184 B  Allocated 19_520 B  Freed 19_520 B
    Performance   Time 0 ms  Speed 6_245_059 IPS

Apparently the stats are printed by default. Which makes this feel like a toy, but again, whatever. Running it that first way gives:

    Interactions  Total 1_069  Annihilate 562  Commute 21  Copy 96  Erase 114  Expand 163  Call 67  Branch 46
    Memory        Heap 1_184 B  Allocated 22_000 B  Freed 22_000 B
    Performance   Time 0 ms  Speed 7_088_859 IPS

IPS seems quite variable at this small an input size. So my conclusion is: if it helps one write code better, go for it. Who cares. But this certainly doesn't seem like some magic bullet that doesn't need to go through the array twice. The Ivy code is not very human-friendly, so I couldn't wrap my head around what it's actually doing, but it looks like a bunch of functional programming. It really does look quite slow and mathematically pure. So as a math toy I think it's doing a good job.

bbminner

"This function calculates the minimum of a list of numbers, and subtracts every number in the list by that minimum. This function currently iterates over the list twice; once to calculate the minimum, and once to do the subtraction. At a first glance, it looks like there is no way to merge these two loops, because you need to calculate the minimum of all the numbers before you can subtract from any of them. As it turns out, this is possible, using the inverse operator."

https://vine.dev/docs/features/inverse

I was long curious about interaction nets. Can someone confirm if Vine will automagically derive an algorithm for doing this with a single pass or it will do one pass in "forward mode", but once the minimum is resolved, follow it up with a second implicit "book keeping and variable re-resolution pass" that will be equivalent to an actual full second pass you'd get in an imperative language?

It is all very much reminiscent of programming in tensorflow 0.x.

tjjfvi

> it will do one pass in "forward mode", but once the minimum is resolved, follow it up with a second implicit "book keeping and variable re-resolution pass" that will be equivalent to an actual full second pass you'd get in an imperative language

Yes, this is correct. Though it's worth noting that this is not some transformation being done by the compiler, but an emergent property of the interaction net.
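A rough single-threaded Python analogy for that emergent behavior, using thunks that close over a to-be-filled hole (this is only an illustration of the two phases, not how the interaction net actually reduces):

```python
def sub_min(v):
    hole = {}        # stands in for the inverse value: promised now, filled later
    pending = []
    m = v[0]
    # Forward pass: one walk over the list, deferring each subtraction
    # against the not-yet-known minimum.
    for x in v:
        m = min(m, x)
        pending.append(lambda x=x: x - hole["min"])
    hole["min"] = m  # the promised value is finally resolved
    # Resolution pass: forcing the thunks is the implicit second traversal.
    return [f() for f in pending]

print(sub_min([9, 4, 12, 2]))  # [7, 2, 10, 0]
```

The forward pass records the deferred subtractions; forcing them at the end is exactly the implicit "book keeping and variable re-resolution" traversal.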

anonzzzies

I was thinking it must be possible to do something semantically similar in Prolog:

   find_min_and_subtract([H], [R], Min) :-  
       Min = H,
       R is H - Min.  
   find_min_and_subtract([H|T], [R|Rs], Min) :-
       find_min_and_subtract(T, Rs, TailMin),
       (H < TailMin -> Min = H ; Min = TailMin),
       R is H - Min,!.
Which gives [9,4,12,2] -> [7, 2, 10, 0].

But this doesn't work for all cases anyway (just the ones where the last value is also the minimum). I feel it could be fully expressed in one pass with miniKanren instead, but no time to figure that out now; it can defer the arithmetic by setting up constraints.

Always liked the idea of 'using variables in the past' (or creating them in the future). My first amazement moment with that was in Prolog, which had quite a lot of wow! moments for the young me who only knew BASIC, Pascal & C. Like the 'write multiply in Prolog' exercise: when you define f(x,y) = x*y, you can not only do f(2,3) and get 6, but also f(2,y) = 6 and get y = 3. So that's what this made me think of.

ngruhn

Prolog also has constraints! You need to import a library called CLP(FD) / CLP(Z) (depending on the Prolog system) and instead of =, <, =<, etc. you need to use #=, #<, #=<, etc.

    :- use_module(library(clpfd)).

    min(Min, Y, Min) :- Min #=< Y, !.
    min(X, Min, Min) :- Min #=< X.

    find_min_and_subtract_([], [], Min, Min).
    find_min_and_subtract_([H|T], [R|Rs], TempMin, Min) :-
        min(TempMin, H, NewTempMin),
        R #= H - Min,
        find_min_and_subtract_(T, Rs, NewTempMin, Min).

    find_min_and_subtract([H|Hs], Rs, Min) :-
        find_min_and_subtract_([H|Hs], Rs, H, Min). 
With that (and an additional variable for the temporary minimum) it works for me:

    ?- find_min_and_subtract([9,2,4,12], Rs, Min).
    Min = 2,
    Rs = [7, 0, 2, 10]
https://swish.swi-prolog.org/p/qgchfMfB.pl

anonzzzies

OW! Thanks, I did not know about that. Last I used Prolog was the mid-90s, but I really liked it; I just did not have any practical use for it after graduation. Thanks for that!

forgotpwd16

Reading this page made me feel the same way as when I was watching Tenet.

stuhood

I think that `inverting` also subsumes async functions/values, which is pretty neat!

In the case where asynchrony was actually necessary, it seems like a great alternative to function coloring.

But whether you should actually use it for something like their `sub_min` example is highly dependent on how good the performance of their implementation is. Creating a graph of references rather than making two passes over an array of integers is not clearly faster ... or clearer, for that matter.

chc4

That's a very cute feature! Thinking about it in terms of thunks makes it pretty understandable to me.

bbminner

Yes, thunks with sugar is my thinking as well. That does not explain the "last unassignment" examples, though. I don't quite understand why that's needed :)

jmholla

I'm a bit confused. How does this not end up as two for loops behind the scenes? The inverse operator still has to be applied to every member after the initial iteration. Or am I missing something?

sw1sh

I like inverse types! While it's not a novel idea, this seems like a very concise implementation of it. Inverse types are also known as negative types (https://legacy.cs.indiana.edu/~sabry/papers/rational.pdf) or holes. This concept is what's needed to make a concrete category for computation compact (highly recommend: http://arxiv.org/abs/0903.0340) and to make the connection to how our physical universe actually works at the quantum level, with antiparticles performing backward computation. I'm pretty sure this model of computation, of filling holes with inference, would allow us to understand the advantages of quantum computation much better.

Do you have any thoughts on HVM's SupGen for program synthesis? I would love to understand how interaction nets makes the notion of inverse types and filling holes more natural and efficient.

ngruhn

I first heard about interaction nets from HVM [1]. It sounds very interesting but I can't say I get it.

[1] https://higherorderco.com/

hyperbrainer

I am very interested in the work they are doing with Bend and Kind, but right now I am just confused as to which language is actually being actively developed. As I see it, right now they have HVM3 (the runtime), Bend, Kind, Kind2, and some other stuff. No idea how all of that is supposed to tie together.

sumnole

> Think of inets as a sort of lambda calculus runtime: lambdas -> inets -> reduced inets -> lambdas

a bit like Tree Calculus but with nets.

szvsw

Related article I’m enjoying for more context on interaction nets:

https://wiki.xxiivv.com/site/interaction_nets.html

zie1ony

Implications look weird https://vine.dev/docs/features/conditions#implication:

    let y = None;
    y is Some(value) => value > 0 // true
    y is Some(value) => value == 0 // true
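Both lines come out true because the implication is vacuous: `y` is `None`, so the `Some(value)` pattern never matches. A hypothetical Python sketch of that reading (the `implies` helper is made up for illustration, not Vine semantics verbatim):

```python
def implies(matched_value, predicate):
    """Pattern-match implication: vacuously true when there's no match."""
    if matched_value is None:        # the `Some(value)` pattern failed to match
        return True
    return predicate(matched_value)  # pattern matched: the predicate decides

y = None
print(implies(y, lambda v: v > 0))   # True (vacuous)
print(implies(y, lambda v: v == 0))  # True (vacuous)
print(implies(3, lambda v: v == 0))  # False
```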

tjjfvi

Author here, happy to answer any questions.

jonahx

Constructive criticism, but I checked out the docs hoping for an up-front explanation of interaction nets and how they can improve a PL, what kinds of problems they are uniquely good for, etc. Finding none, I read the interaction nets wiki page, but still wasn't sure why "a PL based on Interaction Nets" is something I'd want. Of course, if the answer is "why not" in an esolang sense that's great, but I sensed there was something more and wanted to know what it was. In short, why did you make this language.

EDIT: btw, I knew I recognized your handle and it was from this amazing code golf answer:

https://codegolf.stackexchange.com/questions/108170/totally-...

tjjfvi

Yeah, that would be good to add to the docs.

Interaction nets are an alternate model of computation (along the lines of the lambda calculus or Turing machines). They have several interesting properties, one of the most notable being that they are fundamentally parallel. https://en.wikipedia.org/wiki/Interaction_nets https://t6.fyi/guides/inets

I think there are many potential applications for interaction nets in parallel & distributed computing, among other fields; and such applications will need a language – hence Vine.

(re your edit – that's fun lol; Totally Cubular is one of my favorite projects)

azeirah

Do interaction nets have any interesting properties that make them useful for distributed computing? Think IoT, edge computing, embedded, that kind of thing?

And synchronization primitives?

I wanted to also say I loved reading the documentation. Your idea of places, spaces and values feels like a waaay more intuitive naming scheme than what's common in CS.

The time-travel example also felt oddly intuitive to me. I don't really care that it uses black magic underneath, it's quite elegant!

MJGrzymek

This is something I never got through my annual rechecking of the interaction nets wiki page: isn't the lambda calculus also fundamentally parallel? Are nets even more so?

calebh

Isn't any pure language easy to parallelize? What advantage do interaction nets offer over a parallelizing compiler for a purely functional language?

szvsw

Have you thought about how autodiff can be baked into the lang? Given that it is already seemingly all representing computation with graphs of operations, building AD into it seems like a natural extension that could open up some pretty useful applications/bring in some scientific computing interest.

I’m also curious to think about how the “time travel” might interact (if at all) with (1) basic algebra, (2) building off that, systems of equations, and (3) building off of that, something like implicit/Crank-Nicolson finite difference schemes.

Without having thought about it much, it feels intuitively like there might be an elegant way of expressing such algorithms using this notion of information traveling backwards, but perhaps not.

As an aside, although I’m not really a fan, the representation of time travel in Tenet seems like a natural analogy to include in your docs for fun if desired. More relevant, talking about things like backprop in standard ML packages might be one way of providing a shared reference point for contextualizing information flowing backward through a computational graph.

Finally, echoing what others have said about not burying the lede in the docs, as well as wanting to see more motivation/examples/discussion of interaction nets.

radenmuaz

Why Vine and Ivy instead of Bend and HVM?

Kinrany

I would love a better explanation for `~~x = x`. I first understood `~x` as an equivalent of an unresolved future.

Would one be able to model `~x` with, say, Rust channels?

tjjfvi

In interaction nets, one of the core primitives is the 'wire'; wires form the edges in the graph. One can think of a wire as a one-shot channel: it has a 'producer' side, which sends some value across, and a 'consumer' side, which does something with that value.

In that context, then, the inverse operator switches which side of the wire you're talking about. If you have a parameter of type `N32`, you're on the 'consumer side'. But if you have a parameter of type `~N32`, that can be viewed as being the 'producer side' of an `N32` channel. Since `~` just swaps the sides, `~~T` is the same as `T`.
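A loose single-threaded sketch of that picture (the `Wire` class and its method names are invented for illustration; in Rust terms, holding a `~N32` is roughly like holding the sender half of a one-shot channel of `u32`, and an `N32` like the receiver half):

```python
class Wire:
    """A one-shot channel: one producer end, one consumer end."""
    _EMPTY = object()

    def __init__(self):
        self._slot = Wire._EMPTY

    def send(self, value):
        # Holding the ~N32 end means you owe the wire a value.
        assert self._slot is Wire._EMPTY, "one-shot: value already sent"
        self._slot = value

    def recv(self):
        # Holding the N32 end means a value is owed to you.
        assert self._slot is not Wire._EMPTY, "nothing sent yet"
        return self._slot

w = Wire()
w.send(5)        # producer side (~N32)
print(w.recv())  # consumer side (N32) -> 5
```

Swapping which end you hold twice lands you back where you started, which is one way to read `~~T = T`.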

Kinrany

Is it possible to put the 'producer side' into a 'consumer side' of another wire?

Should ~ be considered part of the type signature? Does `let ~x = y` work as destructuring?

Kinrany

What happens when a variable is assigned more than once?

FridgeSeal

This is really, really cool!

I don’t pretend to understand all the theory for nets stuff, but the premise seems super cool and I love languages that have experimental and unusual ideas!

Also liked the dedicated section about the compiler design: the description of the different passes is cool.

teabee89

About the inverse feature: it's not clear to me how code using it evaluates, or how much faster it is. I find the sub_min [1] function confusing to read, and it reminds me how much I value readability. I'm not convinced that feature should be exposed just because interaction nets allow you to flow time backwards. Better free parallelism is already a great improvement!

[1] https://vine.dev/docs/features/inverse

shahbazac

Please provide an example on the first page.

teabee89

What are the differences with HVM?

tjjfvi

There are many. One major difference is that HVM/Bend focus entirely on functional programming, whereas Vine supports both imperative and functional patterns, in a cohesive manner.

vintermann

But that's just about how things look in the higher level language, right? Implementation wise, it's still interaction nets, so...?

I'm interested in a wider comparison, and understanding why you didn't go with HVM, especially as the low-level runtime.

dang

(We changed the URL from https://vine.dev/ to the linked page that starts off with a bit more background)

black7375

Awesome. It looks like easy-mode Rust.

1. Since it is based on interaction nets, does it parallelize well?

2. Will there be a borrow checker like Rust's?

Garlef

Interesting language.

Is it maybe possible to express the plot of "Tenet" using the inverse operator?

butterisgood

Feels like syntax wrapped around propagator networks.

In one's complement we can have negative and positive 0.

0 is not a “natural” number in the traditional sense.

It’s an interestingly opinionated language.