Four Years of Jai (2024)
242 comments
April 15, 2025
sph
mjburgess
Iirc, pretty sure jblow has said he's open sourcing it. I think the rough timeline is: release game within the year, then the language (closed-source), then open source it.
Tbh, I think a lot of open source projects should consider following a similar strategy --- as soon as something's open sourced, you're now dealing with a lot of community management work which is onerous.
xigoi
> as soon as something's open sourced, you're now dealing with a lot of community management work which is onerous.
This is a common misconception. You can release the source code of your software without accepting contributions.
chii
> without accepting contributions.
It's not even contributions: other people might start asking for features or discussing direction independently (which is fine, but jblow is on record saying he doesn't want even that distraction).
The current idea of keeping Jai closed source is to control the type of people who get to alpha test it: people who can overlook the jank but will give feedback on fundamental issues that aren't related to polish. They would also be capable of accepting the alpha-level completeness of the libraries, and of telling a compiler bug apart from their own bug or misuse of a feature, etc.
You can't get this level of control if the source is opened.
mjburgess
I don't think the issue is just contributions. It's the visibility.
When you're a somewhat famous programmer releasing a long anticipated project, there's going to be a lot of eyes on that project. That's just going to come with hassle.
perching_aix
It's not a "misconception". Open source implying open contributions is a very common stance, if not the mainstream one. Source availability is, for better or worse, just one aspect of open source.
mort96
IMO the main thing they're risking by open sourcing it is adoption. Keeping it closed source is a pretty clear sign to the rest of the world that the language is not ready for widespread adoption. As soon as you open source it, even if you mark it as alpha, you'll end up with people using the language, and breaking changes will at that point break people's code.
0x1ceb00da
There is a lot of experimentation going on as well. A few months ago, two new casting syntaxes were added for users to evaluate. The plan is to keep only one and remove the other before release.
sph
That’s what I meant by forked. If Jonathan wants to keep his branch closed source, that’s fine, as long as he cuts a release, gives it a GNU license and calls it OpenJai or something. He doesn’t have to deal with the community, somebody will do that for him.
baranul
An argument can easily be made that Jai could have been released, as closed source, some time ago. Many fans and the curious just want to be able to get their hands on it.
Jon is not going to stop public reaction, nor will Jai be perfect, regardless of when he releases. At least releasing sooner allows it to keep momentum, not just momentum generated by him, but by third parties, such as books and videos about it. Maybe that's where Jon is making a mistake: not allowing others to help generate momentum.
pjmlp
Apparently not only does the 90s approach still work pretty well when the language comes with a piece of green coloured hardware, all the ongoing returns to 90s licensing models prove that the free beer approach isn't working when the goal is to build a sustainable business out of the technology.
WalterBright
Some comparison with D:
> Do I have access to an asm keyword,
Yes, D has a builtin assembler
> or can I easily link assembly files?
Yes
> Do I have access to the linker phase to customize the layout of the ELF file?
D uses standard linkers.
> Does it need a runtime to work?
With the -betterC switch, it only relies on the C runtime
> Can I disable the standard library?
You don't need the C runtime if you don't call any of the functions in it.
sph
Thanks. I haven’t played with D since it also had a closed source implementation (10+ years ago) and never kept up with its newer development. I should check it out again.
WalterBright
D is Boost licensed front to back, which is the freest license out there.
GoblinSlayer
I don't get what's up with the runtime hysteria. All languages have a runtime, except maybe assembler. And the Linux kernel itself is infamous for not being plain C by a large margin. In general, remove something important from any program and it will stop working.
muth02446
If you do embedded work, you often want to be in total control of all memory allocations. So it is good to know that the compiler will not produce invisible heap allocations, and that there is a useful subset of the standard library that does not use them either.
az09mugen
There is a streamer who does a lot of interesting language exploration on his own. I'm not saying you will find all the answers to your questions, but I think you will get a good sense of what you can and cannot do in Jai: https://www.youtube.com/results?search_query=Tsoding+jai
sph
Tsoding is great. Don’t be put off by the memelord persona, he’s a genuinely smart guy always exploring some interesting language or library, or reimplementing something from scratch to truly understand it.
estebank
> Don’t be put off by the memelord persona
One can be put off by whatever one is put off by. I've gotten to the point where I realized that I don't need to listen to everyone's opinion. Everyone's got one. If an opinion is important, it will likely be shared by more than one person. From that it follows that there's no need to subject oneself to specific people one is put off by. Or put another way: if there's an actionable critique, and two people are stating it, and one is a dick and the other isn't, I'll pay attention to the one who isn't a dick. Life's too short to waste it on abrasive people, regardless of whether that is "what is in their heart" or a constructed persona. The worst effect of the "asshole genius" trope is that it makes a lot of assholes think they are geniuses.
globnomulous
> Don’t be put off by the memelord persona
If it's a persona, then he's at best a performer and entertainer pandering to an audience that enjoys or relates to immature, insufferable people. If it isn't a persona, then he's just an immature, insufferable person.
No, thank you. Either way, the result is psychologically, socially, and politically corrosive and typically attracts a horrendous, overall obnoxious audience.
troupo
You can also watch Jonathan Blow himself writing a commercial game and developing jai on stream: https://www.twitch.tv/j_blow
krapp
Is he actually doing that or is he doing what Casey Muratori's doing with Handmade Hero and taking almost a decade to implement a debug room for a generic top-down Zelda clone?
az09mugen
I did not know about this, I will have a look, thanks !
TinkersW
I have my doubts with Jai, the fact that Blow & co seems to have major misunderstandings with regards to RAII doesn't lend much confidence.
Also, a 19,000-line C++ program (this is tiny) does not take 45 minutes unless something is seriously broken; it should be a few seconds at most for a full rebuild, even with a decent amount of template usage. This makes me suspect the author doesn't have much C++ experience, as this should have been obvious to them.
I do like the build script being in the same language, CMake can just die.
The metaprogramming looks more confusing than C++, why is "sin"/"cos" a string?
Based on this article I'm not sure what Jai's strength is, I would have assumed metaprogramming and SIMD prior, but these are hardly discussed, and the bit on metaprogramming didn't make much sense to me.
VyseofArcadia
> Also a 19,000 line C++ program(this is tiny) does not take 45 minutes unless something is seriously broken
Agreed, 45 minutes is insane. In my experience, and this does depend on a lot of variables, 1 million lines of C++ ends up taking about 20 minutes. If we assume this scales linearly (I don't think it does, but let's imagine), 19k lines should take about 20 seconds. Maybe a little more with overhead, or a little less because of less burden on the linker.
There's a lot of assumptions in that back-of-the-envelope math, but if they're in the right ballpark it does mean that Jai has an order of magnitude faster builds.
I'm sure the big win is having a legit module system instead of plaintext header #include
MyOutfitIsVague
It depends heavily on features used, too. C++ without templates compiles nearly as quickly as C.
Jyaif
For 1 million lines of C++ to take 20 minutes you must be building using a single core.
unclad5968
I seriously doubt that any of them have trouble understanding a concept as simple as RAII.
kbr-
Yeah, it's weird, but the author of this post claiming that defer can replace RAII kinda suggests that. RAII isn't just about releasing, in the same scope, the resource you acquired in the current scope. You can pass the resource across multiple boundaries with move semantics, and only at the end, when it's no longer needed, will the resources be released.
deagle50
I don't get the point, what does this have to do with defer?
torginus
Honestly, I concur. Out of interest in what sort of methods they came up with to manage memory, I checked out the language's wiki, and I'm not sure going back to 1970s C (with the defer statement on top) is an improvement. You have to write defer everywhere, and if your object outlives the scope of the function, even that is useless.
I'm sure having to remember to free resources manually caused so much grief that they came up with RAII, so that an object going out of scope (either on the stack, or through its owning object getting destroyed) would clean up its resources.
Compared to a lot of low-level people, I don't hate garbage collection either, with a lot of implementations reducing to pointer bumping for allocation, which is an equivalent behavior to these super-fast temporary arenas, with the caveat that once you run out of memory, the GC cleans up and defragments your heap.
If for some reason, you manage to throw away the memory you allocated before the GC comes along, all that memory becomes junk at zero cost, with the mark-and-sweep algorithm not even having to look at it.
I'm not claiming either GC or RAII are faultless, but throwing up your hands in the air and going back to 1970s methods is not a good solution imo.
That being said, I happen to find a lot that's good about Jai as well, which I'm not going to go into detail about.
lylejantzi3rd
There is no RAII in his language. Why would you care if he understands it or not?
rrgok
What an odd take. It is like saying: there is no addition semantic in his language, why would you care if he understands it or not?
lylejantzi3rd
This take is equally bizarre. Most languages have an addition semantic. Most languages do not have RAII. That's, by and large, a C++ thing. Jai does NOT have RAII. So, again, why would anybody care what his opinion on RAII is?
VyseofArcadia
> The net effect of this is that the software you’re running on your computer is effectively wiping out the last 10-20 years of hardware evolution; in some extreme cases, more like 30 years.
As an industry we need to worry about this more. I get that in business, if you can be less efficient in order to put out more features faster, your dps[0] is higher. But as both a programmer and an end user, I care deeply about efficiency. Bad enough when just one application is sucking up resources unnecessarily, but now it's nearly every application, up to and including the OS itself if you are lucky enough to be a Microsoft customer.
The hardware I have sitting on my desk is vastly more powerful than what I was rocking 10-20 years ago, but the user experience seems about the same. No new features have really revolutionized how I use the computer, so from my perspective all we have done is make everything slower in lockstep with hardware advances.
[0] dollars per second
dayvigo
> The hardware I have sitting on my desk is vastly more powerful than what I was rocking 10-20 years ago, but the user experience seems about the same.
Not even.
It used to be that when you clicked a button, things happened immediately, instead of a few seconds later as everything freezes up. Text could be entered into fields without inputs getting dropped or playing catch-up. A mysterious unkillable service wouldn't randomly decide to peg your core several times a day. This was all the case even as late as Windows 7.
pixl97
At the same time, it was also the case that you typed nine characters into an 8-character field and p0wn3d the application.
>Text could be entered into fields without inputs getting dropped or playing catch-up
This complaint has been around since the DOS days, in my experience. I'm pretty sure it's been industry standard since its inception that most large software providers make the software just fast enough that users don't give up, and that's it.
Take something like Notepad opening files: large files take forever. Yet I can pop open Notepad++, from some random small team, and it opens the same file quickly.
danpalmer
I understand the attitude but I think it misses a few aspects.
We have far more isolation between software, we have cryptography that would have been impractical to compute decades ago, and it’s used at rest and on the wire. All that comes at significant cost. It might only be a few percent of performance on modern systems, and therefore easy to justify, but it would have been a higher percentage a few decades ago.
Another thing that’s not considered is the scale of data. Yes software is slower, but it’s processing more data. A video file now might be 4K, where decades ago it may have been 240p. It’s probably also far more compressed today to ensure that the file size growth wasn’t entirely linear. The simple act of replaying a video takes far more processing than it did before.
Lastly, the focus on dynamic languages is often either misinformed or purposefully misleading. LLM training is often done in Python and it’s some of the most performance sensitive work being done at the moment. Of course that’s because the actual training isn’t executing in a Python VM. The same is true for so much of “dynamic languages” though, the heavy lifting is done elsewhere and the actual performance benefits of rewriting the Python bit to C++ or something would often be minimal. This does vary of course, but it’s not something I see acknowledged in these overly simplified arguments.
Requirements have changed, software has to do far more, and we’re kidding ourselves if we think it’s comparable. That’s not to say we shouldn’t reduce wastage, we should! But to dismiss modern software engineering because of dynamic languages etc is naive.
sesm
> async/await, a pattern increasingly polluting Javascript and Python codebases in the name of performance
In the JS world, async/await was never about performance; it was always about having more readable code than Promise-chain spaghetti.
mppm
Jai's perpetual closed beta is such a weird thing... On the one hand, I sort of get that the developers don't want to waste their time and attention on too many random people trying to butt in with their ideas and suggestions. On the other hand, they are thereby wasting the time and attention of all the people who watched the development videos and read the blog posts, and now can do basically nothing with that knowledge other than slowly forget it. (Except for the few who take the ideas and incorporate them into their own languages).
rglover
The reality of a project like this is that to get it right (which is by the creator's standards, no one else's) takes time. Add on top of that Blow and Thekla are building games with this to dogfood it which takes time, too.
Sadly, there exists a breed of developer that is manipulative, obnoxious, and loves to waste time and denigrate someone building something. Relatively few people are genuinely interested (like the OP) in helping to develop the thing, test builds, etc. Most just want to make contributions for their GitHub profile (assuming OSS) or exorcise their inner demons by projecting their insecurities onto someone else.
From all of the JB content I've seen/read, this is a rough approximation of his position. It's far less stressful to just work on the idea in relative isolation until it's ready (by whatever standard) than to deal with the random chaos of letting anyone and everyone in.
This [1] is worth listening to (suspending cynicism) to get at the "why" (my editorialization, not JB).
Personally, I wish more people working on stuff were like this. It makes me far more likely to adopt it when it is ready because I can trust that the appropriate time was put in to building it.
mppm
I get that. But if you want to work in relative isolation, would it be too much to ask to not advertise the project publicly and wax poetic about how productive this (unavailable) language makes you? Having had a considerable interest in Jai in the past, I do feel a little bit cheated :) even though I realize no binding promises have been made.
troupo
> would it be too much to ask to not advertise the project publicly and wax poetic about how productive this (unavailable) language makes you
All the "public advertisement" he's done was a few early presentations of some ideas and then ... just live streaming his work
ModernMech
It's a common thing in programming language design and circles where some people like to form little cults of personality around their project. Curtis Yarvin did that with his Urbit project. V-Lang is another good example. I consider Elm an example as well.
They get a few "true believer" followers, give them special privileges like beta access (this case), special arcane knowledge (see Urbit), or even special standing within the community (also Urbit, although many other languages where the true believers are given authority over community spaces like discord/mailing list/irc etc.).
I don't associate in these spaces because I find the people especially toxic. Usually they are high drama because the focus isn't around technical matters but instead around the cult leader and the drama that surrounds him, defending/attacking his decisions, rationalizing his whims, and toeing the line.
Like this thread, where a large proportion is discussion about Blow as a personality rather than the technical merit of his work. He wants it that way, not so say that his work doesn't have technical merit, but that he'd rather we be talking about him.
cynical_german
One thing I want to add to the other (so far) good responses: They also seem to build Jai for a means to an end, which is: they are actively developing a game engine with it (to be used for more than one project) and a game, which is already in advanced stages.
If you consider a small team working on this, developing the language seriously, earnestly, but as a means to an end on the side, I can totally see why they think it may be the best approach to develop the language fully internally. It's an iterative develop-as-you-go approach, you're writing a highly specific opinionated tool for your niche.
So maybe it's best to simply wait until engine + game are done, and they can (depending on the game's success) really devote focus and time on polishing language and compiler up, stabilizing a version 1.0 if you will, and "package" it in an appropriate manner.
Plus: they don't seem to be in the "promote a language for the language's sake" game. It doesn't seem to be about finding the perfect release date, with shiny mascot + discord server + full-fledged stdlib + full documentation from day one, to then "hire" redditors and youtubers to spread the word and have an armada of newbie programmers use it to write games.
They seem to see it much more as creating a professional tool aimed at professional programmers, particularly in the domain of high-performance compiled languages, particularly for games. The people they are targeting will evaluate the language thoroughly when it's out, whether that's in 2019, 2025 or 2028. And whether they are top 10 in some popularity contest or not, I just don't think they're playing by such metrics. The right people will check it out once it's out, I'm sure. And whether such a language gets used will probably, hopefully even, not depend on finding the most hyped point in time to release it.
kbr-
> It’s a simple keyword, but it singlehandedly eliminates the need for any kind of RAII.
What if you want to put a resource object (which needs a cleanup on destruction) into a vector then give up ownership of the vector to someone?
I write code in go now after moving from C++ and God do I miss destructors. Saying that defer eliminates need for RAII triggers me so much
estebank
There's a school of thought which, correctly, states that in that case it is very easy to cause expensive drop behavior to be run for each element in the vector where a faster batch approach could be used instead. Batching is doable, if not outright encouraged, with defer, so the language should prioritize pushing people towards it.
I do not subscribe to that idea, because with RAII you can still have batched drops. The only difference between the two defaults is that with defer the failure mode is leaks, while with RAII the failure mode is more code than you otherwise would have.
perching_aix
> rather than asking the programmer to do an extraordinary amount of extra work to conform to syntactically enforced safety rules. Put the complexity in the compiler, dudes.
And how would that compiler work? Magic? Maybe clairvoyance?
selfselfgo
[dead]
lifthrasiir
The biggest "crime" of Jai is that it (soft-)launched like an open source programming language but didn't actually become open source shortly after. There are plenty of programming languages that went through a "beta" period while remaining open source the whole time. Open source doesn't imply open governance, and most such languages still evolve almost solely by the original authors' judgement. It is fine for Jai to remain closed, of course, but there is no practical reason for it to remain closed to this day. The resulting confusion is large enough to dismiss Jai at this stage.
chriscbr
Same story with the Mojo language, unfortunately.
To me this raises the question of whether this is a growing trend, or whether it's simply that languages staying closed source tends to be a death sentence for them in the long term.
jimbob45
Yep. Same dilemma as Star Citizen. If both just threw their hands up and said "Done!" today, then everyone would agree that a great product had been released and everyone would be mostly pleased. Instead, development has dragged on so long as to cast doubt over the goodwill of the founders. Now Jai is unusable because it's difficult to trust Blow if he's willing to let that slide, and Star Citizen is unplayable because the game was clearly released under false pretenses.
pcwalton
> I’d be much more excited about that promise [memory safety in Rust] if the compiler provided that safety, rather than asking the programmer to do an extraordinary amount of extra work to conform to syntactically enforced safety rules. Put the complexity in the compiler, dudes.
That exists; it's called garbage collection.
If you don't want the performance characteristics of garbage collection, something has to give. Either you sacrifice memory safety or you accept a more restrictive paradigm than GC'd languages give you. For some reason, programming language enthusiasts think that if you think really hard, every issue has some solution out there without any drawbacks at all just waiting to be found. But in fact, creating a system that has zero runtime overhead and unlimited aliasing with a mutable heap is as impossible as finding two even numbers whose sum is odd.
sph
The faster computers get, the more overblown the GC problem becomes, apart from super-low-latency niches. Even AAA games these days happily run on GC languages.
There is a prominent contributor to HN whose profile says they dream of a world where all languages offer automatic memory management and I think about that a lot, as a low-level backend engineer. Unless I find myself writing an HFT bot or a kernel, I have zero need to care about memory allocation, cycles, and who owns what.
Productivity >> worrying about memory.
lifthrasiir
GC doesn't exactly solve your memory problem; it typically means that your memory problem gets deferred until you can't ignore it any more. Of course, it is also quite likely that your program will never grow to that point, which is why GC works in general, but also why there exists a desire to avoid it when that makes sense.
mjburgess
Not sure why you're down-voted, this is correct.
In games you have 16ms to draw billion+ triangles (etc.).
In web, you have 100ms to round-trip a request under arbitrarily high load (etc.)
Cases where you cannot "stop the world" at random and just "clean up garbage" are quite common in programming. And when they happen in GC'd languages, you're much worse off.
sph
That’s fair, no resource is unlimited. My point is that memory is usually the least of one’s problem, even on average machines. Productivity and CPU usage tend to be the bottleneck as a developer and a user. GC is mostly a performance problem rather than a memory one, and well-designed language can minimize the impact of it. (I am working on a message-passing language, and only allowing GC after a reply greatly simplifies the design and performance characteristics)
throwawaymaths
eh, there are GC languages famous for high uptimes and deployed in places where it "basically runs forever with no intervention", so in practice with the right GC and application scope, "deferring the concern till the heat death of the universe" (or until a CVE forces a soft update) is possible.
jplusequalt
>Even AAA games these days happily run on GC languages.
Which games are these? Are you referring to games written in Unity where the game logic is scripted in C#? Or are you referring to Minecraft Java Edition?
I seriously doubt you would get close to the same performance in a modern AAA title running in a Java/C# based engine.
steveklabnik
Unreal Engine has a GC.
You're right that there is a difference between "engine written largely in C++ and some parts are GC'd" vs "game written in Java/C#", but it's certainly not unheard of to use a GC in games, pervasively in simpler ones (Heck, Balatro is written in Lua!) and sparingly in even more advanced titles.
Jasper_
Unreal Engine has a C++-based GC.
https://dev.epicgames.com/documentation/en-us/unreal-engine/...
neonsunset
C#? Maybe. Java? Less likely.
Narishma
> Even AAA games these days happily run on GC languages.
You can recognize them by their poor performance.
withoutboats3
This is exactly the attitude this blog post spends its first section pretty passionately railing against.
mjburgess
Well, 1) the temporary allocator strategy; and 2) `defer` kinda go against the spirit of this observation.
With (1) you get the benefits of GC with, in many cases, a single line of code. This handles a lot of use cases. Of those it doesn't, `defer` is that "other single line".
I think the issue being raised is the "convenience payoff for the syntax/semantics burden". The payoff for temp-alloc and defer is enormous: you make the memory management explicit so you can easily see-and-reason-about the code; and it's a trivial amount of code.
There feels something deeply wrong with RAII-style languages: you bear the burden of reasoning about implicit behaviour, all the while this behaviour saves you nothing. It's the worst of both worlds: hiddenness and burdensomeness.
hmry
Neither of those gives memory safety, which is what the parent comment is about. If you release the temporary allocator while a pointer to some data is live, you get use after free. If you defer freeing a resource, and a pointer to the resource lives on after the scope exit, you get use after free.
mjburgess
The dialectic begins with OP, and has pcw's reply and then mine. It does not begin with pcw's comment. The OP complains about Rust not because they imagine Jai is memory safe, but because they feel the rewards of its approach significantly outweigh the costs of Rust.
pcw's comment was about tradeoffs programmers are willing to make -- and paints the picture more black-and-white than the reality; and more black and white than OP.
francasso
While technically true, it still simplifies memory management a lot. The tradeoff in fact is good enough that I would pick that over a borrowchecker.
MJGrzymek
Not sure about the implicit behavior. In C++, you can write a lot of code using vector and map that would require manual memory management in C. It's as if the heap wasn't there.
Feels like there is a beneficial property in there.
pphysch
> Either you sacrifice memory safety or you accept a more restrictive paradigm than GC'd languages give you.
This is true but there is a middle ground. You use a reasonably fast (i.e. compiled) GC lang, and write your own allocator(s) inside of it for performance-critical stuff.
Ironically, this is usually the right pattern even in non-GC langs: you typically want to minimize unnecessary allocations during runtime, and leverage techniques like object pooling to do that.
IOW I don't think raw performance is a good argument for not using GC (e.g. gamedev or scientific computing).
Not being able to afford the GC runtime overhead is a good argument (e.g. embedded programs, HFT).
fleabitdev
It's difficult to design a language which has good usability both with and without a GC. Can users create a reference which points to the interior of an object? Does the standard library allocate? Can the language implement useful features like move semantics and destructors, when GCed objects have an indefinite lifetime?
You'd almost end up with two languages in one. It would be interesting to see a language fully embrace that, with fast/slow language dialects which have very good interoperability. The complexity cost would be high, but if the alternative is learning two languages rather than one...
pphysch
I'm not saying you design a language with an optional GC, I'm saying the user can implement their own allocators i.e. large object pools nested in the GC-managed memory system. And then they get to avoid most of the allocation and deallocation overhead during runtime.
spacechild1
> because most Javascript programmers are entirely unaware of the memory allocation cost associated with each call to anonymous functions
How does calling an anonymous function in JS cause memory allocations?
steveklabnik
I also found this comment a bit strange. I'm not aware of a situation where this occurs, though he might be conflating creating an anonymous function with calling it.
spacechild1
> he might be conflating creating an anonymous function with calling it.
Yeah, that's what I figured. I don't know JS internals all too well, so I thought he might be hinting at some unexpected JS runtime quirk.
nmilo
He probably misspoke: returning or passing anonymous functions causes allocations for the closures, and then calling them causes maybe four or five levels of pointer chasing to reach the data that was (invisibly) closed over.
spacechild1
I don't think there is much pointer chasing at runtime. With lexically scoped closures, it's only the compiler that walks the stack frames to find the referenced variable; the compiled function can point directly to the object in the stack frame. In my understanding, closed-over variables have (almost) no runtime cost over "normal" local variables. Please correct me if I'm wrong.
nmilo
I meant more like storing closures to be used later after any locals are out of the stack frame, but tbh that's an abstraction that also causes allocations in C++ and Rust. On the other hand, no idea how JS internals work but I know in python getting the length of an array takes five layers of pointer indirection so it very well could be pointer to closure object -> pointer to list of closed variables -> pointer to boxed variable -> pointer to number or some ridiculous thing like that.
malkia
I didn't know much about Jai, and started reading it, and it really has (according to the article) some exciting features, but this caught my eye:
"... Much like how object oriented programs carry around a this pointer all over the place when working with objects, in Jai, each thread carries around a context stack, which keeps track of some cross-functional stuff, like which is the default memory allocator to ..."
It reminds me of Go's context, and something like it should exist in any language dealing with multi-threading, as a way of carrying info about the parent thread/process (and tokens) for trace propagation, etc.
leecommamichael
The Odin programming language uses an implicit context pointer like Jai, and is freely available and open source.
malkia
Thanks! I should check it out!
mustache_kimono
> Software has been getting slower at a rate roughly equivalent to the rate at which computers are getting faster.
Cite?
This problem statement is also such a weird introduction to specifically this new programming language. Yes, compiled languages with no GC are faster than the alternatives. But the problem is and was not the alternatives. Those alternatives fill the vast majority of computing uses and work well enough.
The problem is compiled languages with no GC, before Rust, were bug prone, and difficult to use safely.
So -- why are we talking about this? Because jblow won't stop catastrophizing. He has led a generation of impressionable programmers to believe that we are in some dark age of software, when that couldn't be further from the truth.
topspin
I carefully watched a number of the early Jai language YouTube videos. Some of his opinions on non-programming topics are just goofy: I recall him ranting (and I wish I could find it again) about the supposed pointlessness of logarithmic scales (decibels, etc.) vs. scientific notation, and experiencing a pretty bad cringe spasm.
jblow's words are not the Gospel on high.
troupo
> He has led a generation of impressionable programmers to believe that we in some dark age of software, when that statement couldn't be further from the truth.
Have you actually used modern software?
There's a great rant about Visual Studio debugger which in recent versions cannot even update debugged values as you step through the program unlike its predecessors: https://youtu.be/GC-0tCy4P1U?si=t6BsHkHhoRF46mYM
And this is professional software. The state of personal software is worse: most programs cannot show a page of text with a few images without consuming gigabytes of RAM and a not-insignificant percentage of CPU.
mustache_kimono
> Have you actually used modern software?
Uh, yes. When was software better (like when was America great)? Do you remember what Windows and Linux and MacOS were like in 90s? What exactly is the software we are comparing?
> There's a great rant about Visual Studio debugger
Yeah, I'm not sure these are "great rants" as you say. Most are about how software with different constraints than video games isn't made with the same constraints as video games. Can you believe it?
rk06
I am told that in Visual Studio 2008 you could debug line by line and it was smooth, with zero lag. Then Microsoft rewrote VS from C++ to C# and it became much slower.
Modern software is indeed slow, especially when you consider how fast modern hardware is.
If you want to feel the difference, try highly optimised software against a popular one, e.g. Linux vs Windows, Windows Explorer vs File Pilot, Zed vs VS Code.
troupo
> Do you remember what Windows and Linux and MacOS were like in 90s? What exactly is the software we are comparing?
Yes, yes I do.
Since then, computers have become several orders of magnitude more powerful. You cannot even begin to imagine how fast and powerful our machines are.
And yet nearly everything is barely capable of minimally functioning. Everything is riddled with loading screens, lost inputs, freeze frames and janky scrolling etc. etc. Even OS-level and professional software.
I now have an AMD Ryzen 9 9950X3D CPU, a GeForce RTX 5090 GPU, DDR5-6000 RAM and M.2 NVMe disks. I should never see a loading screen, or any operation taking longer than a second. And yet even Explorer manages to spend seconds before showing the contents of some directories.
Surprisingly deep and level-headed analysis. Jai intrigues me a lot, but my cantankerous opinion is that I will not waste my energy learning a closed-source language; this ain't the 90s any more.
I am perfectly fine for it to remain a closed alpha while Jonathan irons out the design and enacts his vision, but I hope its source gets released or forked as free software eventually.
What I am curious about, which is how I evaluate any systems programming language, is how easy it is to write a kernel with Jai. Do I have access to an asm keyword, or can I easily link assembly files? Do I have access to the linker phase to customize the layout of the ELF file? Does it need a runtime to work? Can I disable the standard library?