
A programming language made for me

mrkeen

> In Odin all variables are automatically zero initialized. Not just integers and floats. But all structs as well. Their memory is filled with zeroes when those variables are created.

> This makes ZII extra powerful! There is little risk of variables accidentally being uninitialized.

The cure is worse than the problem. I don't want to 'safely' propagate my incorrect value throughout the program.

If we're in the business of making new languages, why not a compile-time error for reading memory that hasn't been written? Even a runtime crash would be preferable.

lerno

I always find this opinion intriguing, where it's apparently fine that globals are initialized to zero, but you are INSANE to suggest it's the default for locals. What kind of programs are y'all writing?

Clearly the lack of zeroing in C was a trade-off at the time. Just like UB on signed overflow. And now people seem to consider them "obviously correct designs".

Tuna-Fish

I'd prefer proper analysis for globals too, but that is substantially harder.

"Improperly using a variable before it is initialized" is a very common class of bug, and an easy programming error to make. Zero-initializing everything does not solve it! It just converts the bugs from ones where random stack-frame trash is used in lieu of the proper value into ones where zeroes are used. If you wanted a zero value, that's fine, but quite possibly you wanted something else instead and missed it because of complex initialization logic or something.

What I want is a compiler that slaps me when I forget to initialize a proper value, not one that quietly picks a magic value it thinks I might have meant.

tlb

Being initialized to zero is at least repeatable, so if you forget to initialize something you'll notice it immediately in testing. The worst part about uninitialized variables is that they frequently are zero and things seem to work until you change something else that previously happened to use the same memory.
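A minimal C sketch of that repeatability point, using `calloc` as a stand-in for zero initialization (the `Account` type and function names are invented for illustration):

```c
#include <assert.h>
#include <stdlib.h>

/* A hypothetical record type with fields we might forget to set. */
typedef struct { int id; double balance; } Account;

/* malloc returns indeterminate bytes: a forgotten field holds whatever
   trash was in that memory, which can differ from run to run (and
   reading it before writing is undefined behavior in C). */
Account *account_uninit(void) {
    return malloc(sizeof(Account));  /* caller must set every field */
}

/* calloc models the repeatable behavior described above: every byte is
   zero, so a forgotten field reads as 0 on every single run, and the
   bug shows up immediately in testing. */
Account *account_zeroed(void) {
    return calloc(1, sizeof(Account));
}
```

With `account_uninit` the program may "happen to work" until an unrelated change reuses that memory; with `account_zeroed` the wrong-but-stable zero surfaces on the first test run.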

D4ckard

> The worst part about uninitialized variables is that they frequently are zero and things seem to work until you change something else that previously happened to use the same memory.

This is not the whole story. You're making it sound like uninitialized variables _have_ a value but you can't be sure which one. This is not the case. Uninitialized variables don't have a value at all! [1] has a good example that shows how the intuition of "has a value but we don't know which" is wrong:

  use std::mem;
  
  fn always_returns_true(x: u8) -> bool {
      x < 120 || x == 120 || x > 120
  }
  
  fn main() {
      let x: u8 = unsafe { mem::MaybeUninit::uninit().assume_init() };
      assert!(always_returns_true(x));
  }

If you assume an uninitialized variable has a value (but you don't know which), this program should run to completion without issue. But this is not the case. From the compiler's point of view, x doesn't have a value at all, and so it may compile always_returns_true to unconditionally return false. This is weird, but it's the way things are.

It's a Rust example but the same can happen in C/C++. In [2], the compiler turned a sanitization routine in Chromium into a no-op because they had accidentally introduced UB.

[1]: https://www.ralfj.de/blog/2019/07/14/uninit.html

[2]: https://issuetracker.google.com/issues/42402087?pli=1

gingerBill

> You're making it sound like uninitialized variables _have_ a value but you can't be sure which one.

Because that's a valid conceptualization you could have for a specific language. Your approach and the other person's approach are both valid but different, and as I said in another comment, they come with different compromises.

If you are thinking like some C programmers, then `int x;` can either have a value which is just not known at compile time, or you can think of it having a specialized value of "undefined". The compiler could work with either definition, it just happens that most compilers nowadays do for C and Rust at least use the definition you speak of, for better or for worse.


gingerBill

You're assuming that's the style of programming others want to program in. Some people want the "ZII" approach. Your approach is a trade-off with costs which many others would not want to make. So it's not "preferable", it's a different compromise.

iainmerrick

That's clearly correct, as e.g. Go uses this style and there are lots of happy Go users.

I want to push back on the idea that it's a "trade-off", though -- what are the actual advantages of the ZII approach?

If it's just more convenient because you don't have to initialize everything manually, you can get that with the strict approach too, as it's easy to opt-in to the ZII style by giving your types default initializers. But importantly, the strict approach will catch cases where there isn't a sensible default and force you to fix them.

Is it runtime efficiency? It seems to me (but maybe not to everyone) that initialization time is unlikely to be significant, and if you make the ZII style opt-in, you can still get efficiency savings when you really need them.

The explicit initialization approach seems strictly better to me.

gingerBill

> It seems to me... that initialization time is unlikely to be significant

The thing is, initialization cost is a lot more than you think it is, especially when it's done on a per-object level rather than a "group" level.

This is kind of the point of trying to make the zero value useful: it's trivially initialized. In languages that are much stricter in their approach, initialization happens at the per-object level, which moves the cost from anywhere between free (VirtualAlloc/mmap has to produce zeroed memory anyway) and trivially linear (e.g. one memset) up to nested hierarchies of initialization (e.g. a for-loop calling a constructor for each value).
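A rough sketch of the two cost shapes being contrasted (the `Entity` type and function names are invented, not from Odin):

```c
#include <stdlib.h>

/* A hypothetical game-entity type, for illustration. */
typedef struct { float x, y; int hp; } Entity;

/* "Group"-level initialization: one linear pass. calloc here; a single
   memset would also do, and memory fresh from VirtualAlloc/mmap is
   already zeroed, making this effectively free. */
Entity *make_entities_zii(size_t n) {
    return calloc(n, sizeof(Entity));
}

/* Per-object initialization: a constructor-style call per element.
   The result is identical here, but the shape of the cost is not, and
   real constructors usually do far more than assign zeroes. */
static void entity_init(Entity *e) {
    e->x = 0.0f;
    e->y = 0.0f;
    e->hp = 0;
}

Entity *make_entities_ctor(size_t n) {
    Entity *e = malloc(n * sizeof *e);
    if (e)
        for (size_t i = 0; i < n; i++)
            entity_init(&e[i]);
    return e;
}
```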

It's non-obvious why the "strict approach" would be worse, but it's more about how people actually program rather than a hypothetical approach to things.

So of course each style is about trade-offs. There are no solutions, only trade-offs. And different styles will have different trade-offs, even if they are not immediately obvious and require a bit of experience.

A good little video on this is from Casey Muratori, "Smart-Pointers, RAII, ZII? Becoming an N+2 programmer": https://www.youtube.com/watch?v=xt1KNDmOYqA

D4ckard

I agree that zero-initializing doesn't really help avoid incorrect values (which is what the author focuses on) but at least you don't have UB. This is the main selling point IMO.

yusina

Then why not just require explicit initialization? If "performance" is your answer, then a compiler optimization that detects zero-init and skips the writes whenever the allocator already guarantees zeroed memory would be a solution. A much safer alternative. Replacing one implicit behavior with another is hardly a huge success...

90s_dev

I'd guess it was because 0 init is desired often enough that this is a convenient implicit default?

ratatoskrt

> why not compile-time error for reading memory that hasn't been written?

so... like Rust?

Timwi

Curiously, C# does both. It uses compile-time checks to stop you from accessing an uninitialized local and from exiting a struct constructor without initializing all fields; and yet, the CLR (the VM C# compiles to) zero-initializes everything anyway.

mrkeen

This is a pain. I recently switched from Java (and its whole Optional/null mess) to C#. I was initially impressed by its nullable checks, but then I discovered 'default'. Now I gotta check that Guids aren't 0000...? It makes me miss the Java situation.


dontlaugh

That’s likely because p/invoke is quite common.

slowmovintarget

Here's Casey Muratori on his habit of moving to ZII: https://www.youtube.com/watch?v=xt1KNDmOYqA

Much better outcomes and failure modes than RAII. IIRC, Odin mentions game programming as one of its use cases.

jkercher

When I first heard about Odin, I thought: why another C replacement?! What's wrong with Rust or Zig? Then, after looking into it, I had a very similar experience to the author. Someone made a language just for me! It's for people who prefer C over C++ (or write C with a C++ compiler). It has the things that a C programmer has to implement themselves, like tagged unions, slices, dynamic arrays, maps, and custom allocators, while providing quality-of-life features like distinct typing, multiple return values, and generics. It just hits that sweet spot. Now I'm spoiled.

karl_zylinski

It's indeed some kind of sweet spot. It has those things from C I liked. And it made my favorite workflows from C into "first class citizens". Not everyone likes those workflows, but for people like me it's pretty ideal.

christophilus

Yep. It’s my favorite C-replacement. It compiles fast. It has all of the pieces and abstractions I care about and none of the cruft I don’t.

D4ckard

You can do lots of the same things in C too, as the author mentions, without too much pain. See for example [1] and [2] on arena allocators (which can be used exactly like the temporary allocator mentioned in the post) and on accepting that the C standard library is fundamentally broken.

From what I can tell, the only significant difference between C and Odin mentioned in the post is that Odin zero-initializes everything whereas C doesn't. This is a fundamental limitation of C but you can alleviate the pain a bit by writing better primitives for yourself. I.e., you write your own allocators and other fundamental APIs and make them zero-initialize everything.

So one of the big issues with C is really just that the standard library is terrible (or, rather, terribly dated) and that there is no drop-in replacement (like in Odin or Rust where the standard library seems well-designed). I think if someone came along and wrote a new C library that incorporates these design trends for low-level languages, a lot of people would be pretty happy.

[1]: https://www.rfleury.com/p/untangling-lifetimes-the-arena-all...

[2]: https://nullprogram.com/blog/2023/10/08/
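For the curious, a minimal fixed-buffer arena in the spirit of [1], written so allocations come back zeroed and the "better primitives" are ZII by default (all names and sizes here are invented for illustration):

```c
#include <stddef.h>
#include <string.h>

/* Minimal fixed-buffer arena: allocation is a pointer bump, "free" is
   resetting one offset for the whole group of allocations at once. */
typedef struct {
    unsigned char *base;
    size_t cap;
    size_t used;
} Arena;

void *arena_alloc(Arena *a, size_t size) {
    size_t start = (a->used + 15) & ~(size_t)15;      /* 16-byte align */
    if (start > a->cap || size > a->cap - start)
        return NULL;                                  /* out of space */
    void *p = a->base + start;
    a->used = start + size;
    memset(p, 0, size);                               /* ZII by default */
    return p;
}

void arena_reset(Arena *a) { a->used = 0; }           /* frees everything */
```

Lifetime management collapses to "reset the arena when the group of objects dies", which is how a temporary allocator is typically used.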

gingerBill

The author literally says that they used to do that in C. And I've done a lot of those things in C too; it just doesn't mean that C has good defaults or good ergonomics for many of the tasks other languages have been designed to be good at.

9dev

I am not a C programmer, but I have been wondering this for a long time: People have been complaining about the standard library for literal decades now. Seemingly, most people/companies write their own abstractions on top of it to ease the pain and limit exposure to the horrors lurking below.

Why has nobody come along and created an alternative standard library yet? I know this would break lots of things, but it’s not like you couldn’t transition a big ecosystem over a few decades. In the same time, entire new languages have appeared, so why is it that the C world seems to stay in a world of pain willingly?

Again, mind you, I’m watching from the outside, really just curious.

dspillett

> Why has nobody come along and created an alternative standard library yet?

Probably, IMO, because not enough people would agree on any particular secondary standard such that one would gain enough attention and traction¹ to be remotely considered standard. Everyone who already has their own alternatives (or just wrappers around the current stdlib) will most likely keep using them unless by happenstance the new secondary standard (which, by definition, needs to be at least somewhat opinionated) agrees closely with their local work.

Also, maintaining a standard, and a public implementation of it, could be a faffy and thankless task. I certainly wouldn't volunteer for that!

[Though I am also an outsider on the matter, so my thoughts/opinions don't have any particular significance, and an insider might come along and tell us that I'm barking up the wrong tree]

--------

[1] This sort of thing can happen, but is rare. jQuery became an unofficial standard for DOM manipulation and related matters for quite a long time, to give one example - but the gulf between the official standard (and its bad common implementations) at the time and what libraries like jQuery offered was much larger than the benefit a secondary C stdlib standard might give.

HexDecOctBin

> Why has nobody come along and created an alternative standard library yet?

Everybody has created their own standard library. Mine has been honed over a decade, why would I use somebody else's? And since it is designed for my use cases and taste, why would anyone use mine?

yusina

> Why has nobody come along and created an alternative standard library yet?

Because people are so terribly opinionated that the only common denominator is that the existing thing is bad. For every detail that somebody will argue a modern version should have, there will be somebody else arguing the exact opposite. Both will be highly opinionated and for each of them there is probably some scenario in which they are right.

So, the inability of the community to agree on what "good" even means, plus the extreme heterogeneity of the use cases for C, is probably the answer to your question.

gingerBill

Because to be _standard_, it would have to come with the compiler toolchain. And if it's scattered around on the internet, people will not use it.

I tried to create my own alternative about a decade ago which eventually influenced my other endeavours.

But another big reason is that people use C and its stdlib because that's what it is. Even if it is bad, it's the "standard" and trivially available. Most code relies on it, even code that has its own standard library alternative.

arp242

> I think if someone came along and wrote a new C library that incorporates these design trends for low-level languages, a lot of people would be pretty happy.

I suppose glib comes the closest to this? At least the closest that actually sees fairly common usage.

I never used it myself though, as most of my C has been fairly small programs and I never wanted to bother people with the extra dependency.

leecommamichael

Odin was made for me, also. It has been 4 years and I’m still discovering little features that give me the control and confidence I wish I’d had writing C++.

I returned to the language after a stint of work in other tech and to my utter amazement, the parametric polymorphism that was added to the language felt “right” and did not ruin the comprehensibility of the core library.

Thank you gingerBill!

yusina

As long as programmers view a program as a mechanism that manipulates bytes in flat memory, we will be stuck in a world where this kind of topic seems like a success. In that world, an object puts some structure above those memory bytes and obviously an allocator sounds like a great feature. But you'll always have those bytes in the back of your mind and will never be able to abstract things without the bytes in memory leaking through your abstractions. The author even gives an example for a pretty simple scenario in which this is painful, and that's SOA. As long as your data abstraction is fundamentally still a glorified blob of raw bytes in memory, you'll be stuck there.

Instead, data needs to be viewed more abstractly. Yes, it will eventually manifest as bytes in some memory cell, but how it's laid out and moved around is not the concern of you, the programmer, as a user of data types. Looking at some object attribute foo.a or foo.b is just that - the abstract access of some data. Whether a and b are adjacent in memory should be immaterial - or even whether they are on the same machine, or backed by data cells in the same physical memory bank. Yes, in some very specific (!) cases, optimizing for speed makes it necessary to care about locality, but for those cases the language or library needs to provide mechanisms to specify those requirements, and then it will lay things out accordingly. But it's not helpful if we all keep writing in some kind of glorified assembly language. It's 2025, and "data type" needs to mean something more abstract than "these bytes in this order, laid out in memory like this", unless we are writing hand-optimized assembly code, which most of us never do.
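The SOA pain point mentioned above boils down to a layout difference, sketched here in C with invented types (Odin's #soa arrays are one attempt to keep a single access syntax over either layout):

```c
#include <stddef.h>

#define N 4  /* element count, arbitrary for illustration */

/* Array-of-structs: each particle's fields sit together in memory. */
typedef struct { float pos; float vel; } ParticleAoS;

/* Struct-of-arrays: all pos values contiguous, all vel values
   contiguous - the layout change the SOA discussion is about. */
typedef struct {
    float pos[N];
    float vel[N];
} ParticlesSoA;

/* The access syntax leaks the layout: p[i].pos vs p->pos[i]. */
float sum_pos_aos(const ParticleAoS *p, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; i++) s += p[i].pos;
    return s;
}

float sum_pos_soa(const ParticlesSoA *p) {
    float s = 0.0f;
    for (size_t i = 0; i < N; i++) s += p->pos[i];
    return s;
}
```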

lynx97

Well, the DOD people keep finding that caring about the cache is more helpful regarding performance than the casual programmer might think. Even compiler people are thinking about ditching the classical AST for something DOD-based. I admin HPC systems as a dayjob, and I rarely see programmers aware of modern CPU design and how to structure your data such that it actually performs. I get that you'd like to add more abstractions to make programming easier, but I worry that this only adds to the (already rampant) inefficiency of most programs. The architecture is NOT irrelevant. And with every abstraction you put in, you increase the distance the programmer has from knowing how the architecture works. Maybe that's fine for Python and other high-level stuff, but it is not a good idea IMO when dealing with programs with longer runtimes...

bob1029

> caring about the cache is more helpful regarding performance than the casual programmer might think.

Cache is easily the most important consideration if you intend to go fast. The instructions are meaningless if they or their dependencies cannot physically reach the CPU in time.

The latency difference between L1/L2 and other layers of memory is quite abrupt. Keeping workloads in cache is often as simple as managing your own threads and tightly controlling when they yield to the operating system. Most languages provide some ability to align with this, even the high level ones.
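A sketch of that effect, assuming a 64x64 int matrix (so the column-order walk strides 256 bytes per step):

```c
#include <stddef.h>

enum { ROWS = 64, COLS = 64 };

/* Same sum, two traversal orders. Both are correct; only the memory
   access pattern differs. The row-major walk touches adjacent
   addresses, so each fetched cache line is fully used. */
long sum_row_major(int m[ROWS][COLS]) {
    long s = 0;
    for (size_t r = 0; r < ROWS; r++)
        for (size_t c = 0; c < COLS; c++)
            s += m[r][c];
    return s;
}

/* The column-major walk jumps COLS * sizeof(int) = 256 bytes per
   step, wasting most of each cache line it pulls in - same result,
   measurably slower once the data outgrows the cache. */
long sum_col_major(int m[ROWS][COLS]) {
    long s = 0;
    for (size_t c = 0; c < COLS; c++)
        for (size_t r = 0; r < ROWS; r++)
            s += m[r][c];
    return s;
}
```

No timings are claimed here; the point is only that identical logic can have very different cache behavior depending on layout and traversal order.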

whstl

IMO, DOD shows that you don’t have to sacrifice developer ergonomics for performance.

ECS is vastly superior as an abstraction to pretty much everything we had before in games. Tightly coupled inheritance chains of the 90s/2000s were minefields of bugs.

Of course perhaps not every type of app will have the same kind of goldilocks architecture, but I also doubt anyone will stumble into something like that unless they’re prioritizing it, like game programmers did.

gingerBill

I won't get into it too much but virtually no one needs ECS, and if you have to ask how to do it, it's not for you. There are much better ways to organize a game for most people than the highly generic relational-database-like structure that is ECS. ECS does make sense in certain contexts but most people do not need it.

But I agree that DOD in practice is not a compromise between performance and ergonomics, and Odin kind of shows how that is possible.

yusina

That's great! Let the compiler figure out the optimal data layout then! Of course the architecture is relevant. But does everybody need to consider L2 and L3 sizes all the time? Optimizing this is for machines, with very rare exceptions. Expecting every programmer to do optimal data placement by hand is similar to expecting every programmer to call malloc and free in the right order and the correct number of times. And we know how reliable that turned out.

lynx97

I am reluctant to believe compiler optimisations can do everything. Kind of reminds me of the time when people thought auto parallelisation would be a plausible thing. It never really happened, at least not in a predictably efficient way.

johnnyjeans

> That's great! Let the compiler figure out the optimal data layout then!

GHC, which is without a doubt the smartest compiler you can get your grubby mitts on, is still an extremely stupid git that can't be trusted to do basic optimizations. Which is exactly why it exposes so many special intrinsic functions. The "sufficiently smart compiler" myth was thoroughly discounted over 20 years ago.

gingerBill

The compiler cannot know the _purpose_ of your program, and thus cannot "figure out the optimal data layout". It's metaphysically not possible, let alone technically.

Not everybody needs to worry about L2 or L3 most of the time, but if you are using a systems-level programming language where it might be of a concern to you at some point, it's extremely useful to be able to have that control.

> expecting every programmer to call malloc and free in the right order

The point of custom allocators is to not need to do the `malloc`/`free` style of memory allocation, and thus reduce the problems which that causes. And even if you do still need that style, Odin and many other languages offer features such as `defer` or even the memory tracking allocator to help you find the problems. Just like what was said in the article.

gingerBill

> As long as programmers view a program as a mechanism that manipulates bytes in flat memory...

> Yes, it will eventually manifest in memory as bytes in some memory cell...

So people view a program the way the computer actually deals with it? And the way they need to optimize for, since they are writing programs for that machine?

So what is an example of this abstraction you are talking about? Is there an existing language that is closer to what you want? Otherwise you are talking vaguely and abstractly, and it doesn't really help anyone understand your point of view.

yusina

Real world example. You go sit in your ICE car. You press the gas pedal and the car starts moving. And that's your mental model. Depressing pedal = car moves. You do not think "depress pedal" = "more gasoline to the engine" = "stronger combustion" = "higher rpm" = "higher speed". But that's the level those C and C-like language discussions are always on. The consequence of you using this abstraction in your car is that switching to a hybrid or lately an EV is seamless for most people. Depress pedal, vehicle moves faster. Whether there is a battery involved or some hydrogen magic or an ICE is immaterial. Most of the time. Exceptions are race track drivers. But even those drop off their kids at school, during which they don't really care what's under the hood as long as "depress pedal" = "vehicle moves faster".

Intermernet

This may be true, but it's also false. Many regular drivers have an understanding of how the machine they're driving works. Mechanical sympathy is one of the most important things I've ever learnt. It applies to software as well. Knowing how the data structures are laid out in memory, knowing how the cache works, knowing how the compiler messes with the loops and the variables. These aren't necessarily vital information, and good compilers mean that you can comfortably ignore much of these things, but this knowledge definitely makes you a better developer. Same as knowing how the fuel injection system or the aspiration of your ICE will make you a better driver.

pjc50

The perfect analogy, because sometimes people want to drive a manual car, and sometimes people aren't American and it's the default.

johnnyjeans

> You do not think

Actually I do, and I include the inertia and momentum of every piece of the drive-train as well, and the current location of the center of gravity. I'm thinking about all of these things through the next predicted 5 seconds or so at any given time. It comes naturally and subconsciously. To say nothing of how you really aren't going to be driving a standard transmission without that mental model.

Your analogy is appropriate for your standard American whose only experience with driving a car is the 20 minute commute to work in an automatic, and thus more like a hobbyist programmer or a sysadmin than someone whose actual craft is programming. Do you really think truckers don't know in their gut what their fuel burn rate is based on how far they've depressed the pedal?

yusina

And you were perhaps asking about programming languages. Python does not model objects as bytes in physical memory. Functional languages normally don't. That all has consequences, some of which the "close to the metal" folks don't like. But throwing the "but performance" argument at anything that moves us beyond the 80s is really getting old.

hoseja

Uhhhhh that's kind of how I think about the gas pedal though. There's some lag. The engine might stall a bit if you try to accelerate uphill in a wrong way. There's ideal RPM range. Etc.

card_zero

It's always current_year, and I like bytes, thanks.

rixed

Ideally, the same language would allow programmers to see things at different abstraction levels, no? Because when you are stuck with bytes and allocators and doing everything else manually, it's tedious and you develop hand arthritis in your 30s. But when you have only abstractions and the performance is unacceptable because no magic happened, then it's not great either.

Philpax

While I agree with you to some extent - working with a higher-level language where you _don't_ have that kind of visibility is its own kind of liberating - Odin is very specifically not that kind of language, and is designed for people who want or need to operate in a machine-sympathetic fashion. I don't think that's necessary all the time, but some form of it does need to exist.


StopDisinfo910

> Instead, data needs to be viewed more abstractly.

There is no instead here. This is not a choice that has to be made once and for all and there is no correct way to view things.

Languages exist if you want to have a very abstract view of the data you are manipulating and they come with toolchains and compilers that will turn that into low level representation.

That doesn’t preclude the interest of languages which expose this low level architecture.

yusina

Sure. But solving problems at the wrong level of abstraction is always doomed to fail.

StopDisinfo910

That would be true if it was always the wrong level of abstraction.

It's obviously not for the low level parts of the toolchain which are required to make very abstract languages work.

ulbu

and we should probably look at alcoholic liver disease as an expression of capitalism.

data is bytes. period. your suggestion rests on someone else seeing how it is the case and dealing with it to provide you with ways of abstraction you want. but there is an infinity of possible abstractions – while virtual memory model is a single solid ground anyone can rest upon. you’re modeling your problems on a machine – have some respect for it.

in other words – most abstractions are a front-end to operations on bytes. it’s ok to have various designs, but making lower layers inaccessible is just sad.

i say it’s the opposite – it’s 2025, we should stop stroking the imaginaries of the 80s and return to the actual. just invest in making it as ergonomic and nimble as possible.

i find it hard to understand why some programmers are so intent on hiding from the space they inhabit.

codr7

Which parts of the C standard library has any need for allocators?

gingerBill

Loads of libc functions allocate. The trivial ones being malloc/calloc/free/strdup/etc., but many other things within it will also allocate, like qsort. And that means you cannot change how those things allocate either.
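A sketch of the alternative being implied: routines that take the allocator explicitly instead of hard-wiring malloc. The `AllocFn` signature and names below are invented for illustration; Odin achieves something similar by threading an allocator through its implicit context.

```c
#include <stdlib.h>
#include <string.h>

/* A caller-supplied allocation function, with an opaque user pointer
   so arenas, pools, etc. can carry their own state. */
typedef void *(*AllocFn)(size_t size, void *user);

/* Like libc's strdup, but parameterized over the allocator. */
char *strdup_with(const char *s, AllocFn alloc, void *user) {
    size_t n = strlen(s) + 1;
    char *copy = alloc(n, user);
    if (copy) memcpy(copy, s, n);
    return copy;
}

/* Adapter so plain malloc still fits the interface. */
void *malloc_adapter(size_t size, void *user) {
    (void)user;
    return malloc(size);
}
```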

jay_kyburz

I've been messing around with Odin and Raylib for a few weeks. I've been interested in trying Raylib for a long time; it has a huge list of language bindings. I chose Odin for different reasons than I think many would. Perhaps superficial reasons.

I'm a game-play programmer and not really into memory management or complex math. I like things to be quick and easy to implement. My games are small. I have no need for custom allocators or SOA. All I want is a few thousand sprites at ~120fps. I normally just work in the browser with JS. I use Odin like it's a scripting language.

I really like the dumb stuff like... no semicolons at the end of lines, no parentheses around conditionals, the case statement doesn't need breaks, no need to write var or let, the basic iterators are nice. Having a built-in vector2 type is really nice. Compiling my tiny programs is about as fast as refreshing a browser page.

I also really like C style procedural programing rather than object oriented code, but when you work in a language that most people use as OO, or the standard library is OO, your program will end up with mixed paradigms.

It's only been a few weeks, but I like Odin. It's like a statically typed and compiled scripting language.

karl_zylinski

I like this aspect about Odin. It doesn't try to fundamentally solve any new problems. Instead it does many things right. So it becomes hard to say "this is why you should use Odin". It's more like, try it for yourself and see if you like it :)

jmull

The author is excited that they can do all the things in Odin that they can do in C.

So it strikes me that a new language may be the wrong approach to addressing C's issues. Can they truly not be addressed with C itself?

E.g., here's a list of some commonly mentioned issues:

* standard library is godawful, and composed almost entirely of foot guns. New languages fix this by providing new standard libraries. But that can be done just as well with C.

* lack of help with safety. The solutions people put forward generally involve some combination of static analysis disallowing potentially unsafe operations, runtime checks, and provided implementations of mechanisms around potentially unsafe operations (like allocators and slices). Is there any reason these cannot be done with C? (In fact, I know they all have been done.)

* lack of various modern conveniences. I think there's two aspects of this. One is aesthetics -- people can feel that C code is inelegant or ugly. Since that's purely a matter of personal taste, we have to set that aside. The other is that C can often be pretty verbose. Although the syntax is terse, its low-level nature means that, in practice, you can end up writing a relatively large number of lines of code to do fairly simple things. C alternatives tend to provide syntax conveniences that streamline common & preferred patterns. But it strikes me that an advanced enough autocomplete would provide the same convenience (albeit without the terseness). We happen to have entered the age of advanced autocomplete.
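As a sketch of the "runtime checks and slices in C" point above, a bounds-checked slice takes only a few lines (a hypothetical illustration, not any particular library):

```c
#include <stdio.h>
#include <stdlib.h>

/* A fat pointer carrying its length, so every access can be checked. */
typedef struct {
    int *data;
    size_t len;
} IntSlice;

int slice_get(IntSlice s, size_t i) {
    if (i >= s.len) {  /* runtime bounds check, as in Odin or Rust */
        fprintf(stderr, "index %zu out of range (len %zu)\n", i, s.len);
        abort();
    }
    return s.data[i];
}
```

What C cannot provide is the other half: enforcement. Nothing stops code from bypassing `slice_get` and indexing `s.data` raw, which is where static analysis or a new language comes in.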

Building a new language, along with the ecosystem to support it, is a lot of fun. But it also seems like a very inefficient way to address C's issues, because you have to recreate so much (including all the things about C that aren't broken), and you have to reach some critical mass of adoption/usage to become relevant and sustainable. And to be frank, it's also a pretty ineffective way to address C's issues because it doesn't actually do anything to help all the existing C code. Very few projects are in a position to be rewritten. Much better would be to have a fine-grained set of solutions that code bases could adopt incrementally according to need and opportunity.

Of course, I realize all this has been happening with C all along. I'm just pointing out that that seems like the right approach, while these C alternatives, fun and exciting as they are (as far as these things go), are probably just sound and fury that will ultimately fade away. (In fact, it might be worse if some catch on... C and all the C code bases will still be there, we'll just have more fragmentation.)

gingerBill

I'm the creator of the Odin programming language and I originally tried to approach it by fixing C. And my conclusion was that C could not be fixed.

I made my own standard library to replace libc. But safety is hard to add when you don't have a decent enough type system. C's lack of a proper array type is a good example of this.

Before making Odin, I tried making my own C compiler with some extensions, specifically adding proper arrays (slices) with bounds checking, and adding `defer`. This did help things a lot, but it wasn't enough. C still had fundamentally broken semantics in so many places that just "fixing" the problems of C in C was not enough.

I didn't want to make Odin initially, but it was the conclusion I had after trying to fix something that cannot be fixed.

Fraterkes

[flagged]

gingerBill

My hobbies would not be suitable for HackerNews. What do you think HackerNews is for?

christophilus

Keep posting, gingerBill. I love Odin threads when they pop up here. And I love Odin. Keep up the good work.

CrimsonRain

For many people, hacking away is (the) hobby.

It's a sad situation when people like you pollute this field with your "computer is just a tool for me to make money" attitude.