I stopped everything and started writing C again
474 comments · March 12, 2025
kstrauser
I think Rust is harder to learn, but once you grok it, I don't think it's harder to use, or at least to use correctly. It's hard to write correct C because the standard tooling doesn't give you much help beyond `-Wall`. Rust's normal error messages are delightfully helpful. For example, I just wrote some bad code and got:
--> src/main.rs:45:34
|
45 | actions.append(&mut func(opt.selected));
| ---- ^^^^^^^^^^^^ expected `&str`, found `String`
| |
| arguments to this function are incorrect
|
help: consider borrowing here
|
45 | actions.append(&mut func(&opt.selected));
|
I even had to cheat a little to get that far, because my editor used rust-analyzer to flag the error before I had the chance to build the code.
Also, I highly recommend getting into the habit of running `cargo clippy` regularly. It's a wonderful tool for catching non-idiomatic code. I learned a lot from its suggestions on how I could improve my work.
kelnos
> I think Rust is harder to learn, but once you grok it, I don't think it's harder to use, or at least to use correctly. It's hard to write correct C because the standard tooling doesn't give you much help beyond `-Wall`.
When I say Rust is harder to use (even after learning it decently well), what I mean is that it's still easier to write a pile of C code and get it to compile than it is to write a pile of Rust code and get it to compile.
The important difference is that the easier-written C code will have a lot more bugs in it than the Rust code will. I think that's what I mean when I say Rust is harder to use but I'm more productive in it: I have to do so much less debugging when writing Rust, and writing and debugging C code is more difficult and takes up more time than writing the Rust code (and doing whatever little debugging is still necessary there).
> Also, I highly recommend getting into the habit of running `cargo clippy` regularly. It's a wonderful tool for catching non-idiomatic code.
That's a great tip, and I usually forget to do so. On a couple of my personal projects, I have a CI step that fails the build if there are any clippy messages, but I don't use it for most of my personal projects. I do have a `cargo fmt --check` in my pre-commit hooks, but I should add clippy to that as well.
sestep
If you're using VS Code then you can add `"rust-analyzer.check.command": "clippy"` to your `settings.json`. I assume there's a similar setting for rust-analyzer in other editors.
kstrauser
That's a fair distinction. Basically, it's easier to write C that compiles than Rust that compiles, but it's harder to write correct C than correct Rust.
Regarding Clippy, you can also crank it up with `cargo clippy -- -Wclippy::pedantic`. Some of the advice at that level gets a little suspect. Don't just blindly follow it. It offers some nice suggestions though, like:
warning: long literal lacking separators
--> src/main.rs:94:22
|
94 | if num > 1000000000000 {
| ^^^^^^^^^^^^^ help: consider: `1_000_000_000_000`
|
that you don't get by default.
pkolaczk
Getting something to compile is never the end goal. It takes 0 effort to get Python to compile.
nicoburns
> it's still easier to write a pile of C code and get it to compile than it is to write a pile of Rust code and get it to compile.
As someone who is more familiar with Rust than C: only if you grok the C build system(s). For me, getting C to build at all (esp. if I want to split it up into multiple files or use any kind of external library) is much more difficult than doing the same in Rust.
ohgr
I’d add that the Rust code and C code will probably have the same number of bugs. The C code will likely have some vulnerabilities on top of those.
Rust doesn't magically make the vast majority of bugs go away. Most bugs are entirely portable!
epidemian
> Also, I highly recommend getting into the habit of running `cargo clippy` regularly.
You can also have that hooked up to the editor, just like `cargo check` errors. I find this to be quite useful, because i have a hard time getting into habits, especially for things that i'm not forced to do in some way. It's important that those Clippy lints are shown as soft warnings instead of hard errors though, as otherwise they'd be too distracting at times.
seba_dos1
> It's hard to write correct C because the standard tooling doesn't give you much help beyond `-Wall`
I won't disagree that correct C is harder to write, but it's not 2005 anymore and standard tooling gives you access to things like asan, msan, ubsan, tsan, clang-tidy...
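For example, even a classic use-after-free gets caught at runtime these days. A tiny made-up repro (the sanitizer flags below work with both gcc and clang):

/* Build with: cc -g -fsanitize=address,undefined bug.c && ./a.out */
#include <stdlib.h>

int main(void) {
    int *p = malloc(sizeof *p);
    *p = 42;
    free(p);
    return *p;   /* heap-use-after-free: ASan aborts with a full report here */
}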
oconnor663
C-with-sanitizers is miles ahead of what C used to be, but just a couple weeks ago I ran into a dangling pointer bug that ASan doesn't catch. (Sidenote 5 here: https://jacko.io/smart_pointers.html) It seems like one of the big downsides of sanitizers is that they can't instrument other people's code, in this case the C standard library.
mountainriver
Agree, Rust is quite hard to learn, but now that I know it I have a hard time writing anything else. It really gives you the best of a lot of worlds.
Granted, I can still crank out a Python program faster that kinda works, but god forbid you need to scale it or use any sort of concurrency at all.
kelvinjps10
Go?
crabbone
* Rust errors can be equally unhelpful. Also, the error you posted is hands down awful. It doesn't tell you what went wrong, and it's excessively naive to rely on the compiler to offer a correct fix in all but the most trivial cases. When errors happen, it's a consequence of an impasse, a logical contradiction: two mutually exclusive arguments have been made: a file was assumed to exist, but was also assumed not to exist -- this is what's at the core of the error. The idiotic error that the Rust compiler gave you doesn't say what the assumptions were, it just, essentially, tells you "here's the error, deal with it".
* In Rust, you will have to deal with a lot of unnecessary errors. The language is designed to make its users create a host of auxiliary entities: results, options, futures, tasks and so on. Instead of dealing with the "interesting" domain objects, the user of the language is mired in the "intricate interplay" between objects she doesn't care about. This is, in general, a woe of languages with extensive type systems, but in Rust it's a woe on a whole new level. Every program becomes a Sisyphean struggle to wrangle through all those unnecessary objects to finally get to write the actual code. Interestingly though, there's a tendency in a lot of programmers to like solving these useless problems instead of dealing with the objectives of their program (often because those objectives are boring or because programmers don't understand them, or because they have no influence over them).
necubi
I don't follow your first point—the compiler is pointing out exactly what the problem is (the argument has the incorrect type) and then telling you what you likely wanted to do (borrow the String). What would you see as a more helpful error message in this case?
chaotic-good
I tend to agree with this.
The code tends to be loaded with primitives that express ownership semantics or error handling. Every time something changes (for instance, you want to not just read but also modify the values referenced by an iterator) you have to change code in many places (you have to invoke 'as_mut' explicitly even if you're accessing your iterator through a mutable ref). This could be attributed (partially) to the lack of function overloading. People believe that overloading is often abused, so it shouldn't be present in a "modern" language. But in languages like C++, overloading also helps with const correctness and move semantics. In C++ I don't have to invoke 'as_mut' to modify the value referenced by a non-const iterator, because the dereferencing operator has const and non-const overloads.
Async Rust is on another level of complexity compared to anything else I've used. Lifetimes are often necessary and everything is wrapped into multiple layers; everything is Arc<Mutex<Box<*>>>.
j-krieger
> Rust errors can be equally unhelpful. Also, the error you posted is hands down awful. It doesn't tell you what went wrong, and it's excessively naive to rely on compiler to offer a correct fix in all but the most trivial cases
What? It tells the user exactly what's wrong
> Every program becomes a Sisyphean struggle to wrangle through all those unnecessary objects to finally get to write the actual code
That is the cost of non-nullable types and correctness. You still have to do the Sisyphean struggle in other programming languages, but without hints from the compiler.
samiv
The biggest problem with C is that it doesn't even have enough features to help you build the features and abstractions you need and want.
For example with C++ the language offers enough functionality that you can create abstractions at any level, from low level bit manipulation to high level features such as automatic memory management, high level data objects etc.
With C you can never escape the low level details. Cursed to crawl.
ChrisMarshallNY
Just FYI.
Back in 1994/95, I wrote an API, in C, that was a communication interface. We had to use C, because it was the only language that had binary/link compatibility between compilers (the ones that we used).
We designed what I call "false object pattern." It used a C struct to simulate a dynamic object, complete with function pointers (that could be replaced, in implementation), and that gave us a "sorta/kinda" vtable.
Worked a charm. They were still using it, 25 years later.
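For readers who haven't seen the pattern, a minimal sketch of the idea (the names here are hypothetical, not the original 1990s API):

/* "False object pattern": the struct carries its own function pointers,
   which an implementation can replace -- a hand-rolled vtable. */
#include <stdio.h>

typedef struct Channel Channel;
struct Channel {
    void *impl;                                   /* implementation-private state */
    int  (*send)(Channel *self, const char *msg);
    void (*close)(Channel *self);
};

static int log_send(Channel *self, const char *msg) {
    (void)self;
    return printf("sending: %s\n", msg);
}

static void log_close(Channel *self) {
    (void)self;
    puts("closed");
}

int main(void) {
    /* every slot has to be filled by hand; a forgotten pointer means a crash */
    Channel ch = { NULL, log_send, log_close };
    ch.send(&ch, "hello");
    ch.close(&ch);
    return 0;
}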
That said, I don't really miss working at that level. I have been writing almost exclusively in Swift, since 2014.
PaulDavisThe1st
> We designed what I call "false object pattern." It used a C struct to simulate a dynamic object, complete with function pointers (that could be replaced, in implementation), and that gave us a "sorta/kinda" vtable.
You were not alone in this. It is the basis of glib's GObject which is at the bottom of the stack for all of GTK and GNOME.
kelnos
Sure, that's a pretty common pattern in use in C to this day. It's a useful pattern, but it's still all manual. Forget to fill in a function pointer in a struct? Crash. At least with C++ it will fail to compile if you don't implement a method that you have to implement.
gtirloni
Of course you can. It's quite the opposite actually. The downside is that in C you have to code a bunch of abstractions _yourself_. See how large projects like the Linux kernel make extensive use of macros to implement an object system.
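The usual kernel-style trick is struct embedding plus an offsetof() macro. Roughly, as a simplified sketch of the container_of idea (not the kernel's actual definition):

#include <stddef.h>
#include <stdio.h>

/* Recover the containing struct from a pointer to one of its members. */
#define container_of(ptr, type, member) \
    ((type *)((char *)(ptr) - offsetof(type, member)))

struct device {               /* the "base class" */
    const char *name;
};

struct usb_device {           /* the "derived class" */
    struct device dev;        /* base embedded as a member */
    int port;
};

/* Takes the "base" pointer, recovers the "derived" object. */
static void print_port(struct device *d) {
    struct usb_device *u = container_of(d, struct usb_device, dev);
    printf("%s on port %d\n", d->name, u->port);
}

int main(void) {
    struct usb_device mouse = { .dev = { .name = "mouse" }, .port = 3 };
    print_port(&mouse.dev);
    return 0;
}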
sim7c00
so confused by this one. C is the most free to build systems and abstractions. it doesn't lock you down into any paradigm so you can build your own....
you can abstract away perfectly fine low level details and use your own high level constructs to build what you want in a memory safe way...
high level data objects?
sure it doesn't have garbage collection, so it motivates you not to leave garbage lying around for some slow background process to collect.
actually you can build this in C you just do not want to...
you can give all your objects reference counters and build abstractions to use that, you can implement 'smart pointers' if you want, and have threads do garbage collection if you want... why not? what exactly is stopping you?
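For example, a bare-bones version of that reference-counting idea (a single-threaded sketch with made-up names; a real one would need atomics):

#include <stdlib.h>
#include <string.h>

/* A refcounted buffer: every object carries a count, retain/release manage its lifetime. */
typedef struct {
    int    refs;
    size_t len;
    char   data[];
} Buf;

static Buf *buf_new(const char *s) {
    size_t len = strlen(s);
    Buf *b = malloc(sizeof *b + len + 1);
    if (!b) return NULL;
    b->refs = 1;
    b->len = len;
    memcpy(b->data, s, len + 1);
    return b;
}

static Buf *buf_retain(Buf *b)  { b->refs++; return b; }
static void buf_release(Buf *b) { if (b && --b->refs == 0) free(b); }

int main(void) {
    Buf *a = buf_new("hello");
    Buf *alias = buf_retain(a);   /* two owners now */
    buf_release(a);               /* still alive, alias owns it */
    buf_release(alias);           /* last owner gone, memory freed */
    return 0;
}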
maybe its less convenient to go that route. but impossible? nope.
connect GDB to your c++ program and see how it works... it's not like c++ suddenly doesn't become machine code. and c translates perfectly fine to machine code...
samiv
Sure you can do this, but the point is that it's all manual and always will be because you get no help from the compiler.
Compare with C++, where you can have higher level types and the compiler helpfully provides features that let the programmer do stuff such as RAII or unique_ptr.
Huge difference.
quchen
Thank you for your work on XFCE. It's been the best WM/UI for me for over a decade.
WalterBright
> Memory leaks, NULL pointer dereferences, use-after-free
I suffered writing those for many years. I finally simply learned not to do them anymore. Sort of like there's a grain of sand on the bottom of my foot and the skin just sort of entombed it in a callus.
kelnos
I've seen you make these kinds of comments before on other articles. Please stop. Not everyone is perfect and can forevermore avoid making any mistakes. I strongly suspect your opinion of your skill here is overinflated. Even if it isn't, and you really are that good, everyone cannot be in the top 0.00001% of all programmers out there, so your suggestion to "simply" learn not to make mistakes is useless.
This all just comes off incredibly arrogant, if I'm being honest.
otterley
I think a more charitable interpretation of what he said was, "after sufficient practice, I became good enough to start avoiding those pitfalls."
It's not all that different from learning any challenging task. I can't skateboard to save my life, but the fact that people can do it well is both admirable and the result of hundreds or thousands of hours of practice.
Skilled people can sometimes forget how long it took to learn their talent, and can occasionally come off as though the act is easy as a result. Don't take it too harshly.
WalterBright
> This all just comes off incredibly arrogant
I know, but it's the truth. Consider another case. I worked in a machine shop in college trying to make my steam engine. The man who ran the shop, was kind enough to show me how to operate the machines.
Those machines were like C, built during WW2. They had no user friendly features. The milling machine was a particular nightmare. It was festooned with levers and gears and wheels to control every aspect. There was no logic to the "user interface". Nothing was labeled. The worst one was a random lever that would reverse the operation of all the other levers. My terror was wrecking the machine by feeding the tool mount into the cutting bit.
I would make a part, and it would come out awful - things like the surface finish was a mess. If he had some time, he'd come over and help me. He'd turn a wheel, "ting" the blade on a grinding wheel, adjust the feed, and make a perfect part (all by eyeball, I might add). The man was just incredible with those machines. I was just in awe. He never made a mistake. Have you ever tried to get a part centered properly in a 4-jaw chuck? Not me. He could on the first try every time.
But he'd been doing it every day for 40 years.
defrost
He's not making a comment about everyone; it's a specific comment about how often long-time C programmers make basic mistakes after a million SLOC or so.
In this instance Walter is correct - the mistakes he listed are very rarely made by experienced C programmers, just as ballet dancers rarely trip over their own feet walking down a pavement.
The problem of those errors being commonplace in those who are barely five years into C coding, and who still have another five to go before hitting the ten-year mark, still exists, of course.
But it's a fair point that given enough practice and pain those mistakes go away.
BigJono
You come off as incredibly arrogant too, you just don't realise it because you have the current mainstream opinion and the safety of a crowd.
Do you know how fucking obnoxious it is when 200 people like you come into every thread to tell 10 C or Javascript developers that they can't be trusted with the languages and environments they've been using for decades? There are MILLIONS of successful projects across those two languages, far more than Rust or Typescript. Get a fucking grip.
milesrout
Surely it is better than yet another self-promoting mention of his programming language on every unrelated C, Rust or Zig post?
usamoi
This is actually quite easy to achieve, as long as you cannot realize your own mistakes.
eikenberry
Have you looked at Zig? It is often termed a modern C where Rust is the modern C++. Seems like a good fit.
harrison_clarke
i never really understand why these get compared. i wouldn't expect that much overlap in the audiences
zig seems like someone wanted something between C and "the good parts" of C++, with the generations of cruft scrubbed out
rust seems like someone wanted a haskell-flavoured replacement for C++, and memory-safety
i would expect "zig for C++" to look more like D or Carbon than rust. and i'd expect "rust for C" to have memory safety and regions, and probably steal a few ocaml features
elteto
The single best feature (and I would say the _core_ feature separating it from C) that C++ has to offer is RAII and zig does not have that. So I don’t know which good parts of C++ they kept. Zig is more of its own thing, and they take from wherever they like, not just C++.
hyperbrainer
I would say OCaml more than Haskell, but yes.
kelnos
I have, and I do find Zig impressive, but it doesn't go far enough for me. I don't want a "better C", I want a better systems language that can also scale up for other uses.
I like strong, featureful type systems and functional programming; Zig doesn't really fit the bill for me there. Rust is missing a few things I want (like higher-kinded types; GATs don't go far enough for me), but it's incredible how they've managed to build so many zero- and low-cost abstractions and make Rust feel like quite a high-level language sometimes.
rat87
Almost everyone wants to claim to be the modern C and not the modern C++, because everyone shits on C++.
If anything is the modern C++, it's D. They have template metaprogramming, while Rust has generics and macros.
Rust doesn't have constructors or classes or overloading. I believe its type system is based on Hindley–Milner like ML or Haskell, and traits are similar to Haskell type classes. Enums are like tagged union/sum types in functional programming, and Rust uses Error/Option types like them. I believe Rust macros were inspired by Scheme. And finally, a lot of what makes it unique was inspired by Cyclone (an obscure experimental language that tried to be a safer C) and other obscure research languages.
I guess Rust has RAII, that's one major similarity to C++. And there are probably some similarities in low-level memory access abstraction patterns.
But I'd describe Rust as more of an imperative non-GC offshoot of the MLs than a modern C++ evolution.
jandrewrogers
Rust is not a modern C++; their core models are pretty different. Both Rust and C++ can do things the other can't. C++ is more focused on low-level, hyper-optimized systems programming; Rust is a bit higher level and has stronger guardrails, but with performance closer to a classic systems language.
I do think Zig is a worthy successor to C and isn’t trying to be C++. I programmed in C for a long time and Zig has a long list of sensible features I wish C had back then. If C had been like Zig I might never have left C.
jpc0
What do you consider the difference in their core models?
Rust and C++ both use RAII, both have a strong emphasis on type safety, Rust just takes that to the extreme.
I would like to even hope both believe in 0 cost abstractions, which contrary to popular belief isn't no cost, but no cost over doing the same thing yourself.
In many cases it's not even 0 cost, it's negative cost since using declarative programming can allow the compiler to optimise in ways you don't know about.
xmcqdpt2
C++ is (according to Bjarne Stroustrup) a general purpose programming language that can be used to build general business software and applications, not just a systems PL. This is why perf is all over the place: the core language is fast like C, but the stdlib contains terribly slow code (regex, exceptions) and ways to waste lots of cycles (wrapping everything in smart pointers, using std::map instead of std::unordered_map).
BuckRogers
That is very interesting. You have quite the resume too. While I've dabbled in nearly everything, I'm a day to day pro C# developer and I absolutely love it. I've never been upset or had a complaint. If I were forced off for some reason, I'd just go to Typescript. I can't imagine using C. Perhaps with some form of AI valgrind. The problems C solved are just not relevant any longer, and it remains entrenched in 2025. Rust with AI analysis will be amazing to see the results of.
codr7
It looks to me like C is still very relevant, and isn't going anywhere anytime soon.
I realize a lot of people don't want to use it; and that's fine, don't use it.
tartoran
It depends on problem domain. If you're writing kernel level code or device drivers you wouldn't use C# or typescript.
p_ing
There are at least a few kernels, bootloaders, and device drivers written in C# out there… granted for hobby/research.
https://github.com/Michael-K-GH/RoseOS
https://vollragm.github.io/posts/kernelsharp/
https://www.microsoft.com/en-us/research/project/singularity... (asm, C, C++, largely C# (Sing#))
Sysreq2
I will say the more recent additions to C++ at least have solved many of my long standing issues with that C-variant. Most of it was stuff that was long overdue. Like string formatting or a thread safe println. But even some of the stuff I didn’t think I would love has been amazing. Modules. Modules bro. Game changer. I’m all in. Honestly C++ is my go to for anything that isn’t just throw away again. Python will always be king of the single use scripts.
klysm
The problem is that they are _additions_; C++ has such absurd sprawl. The interactions between everything in this massive sprawl are quite difficult to grasp.
silisili
That's also a problem in C land, of course, perhaps with less total sprawl.
Yeah, it has new features, but if you're stuck working on a C89 codebase, good luck!
I don't know a great answer to that. I almost feel like languages should cut and run at some point and become a new thing.
kelnos
I lost interest in keeping up with C++'s advances more than a decade ago.
The problem is that I want a language where things are safe by default. Much of the newer stuff added to C++ makes things safer, perhaps even to the level of Rust's guarantees -- but only if you use exclusively the new things, and never -- even by accident -- use any of the older patterns.
I'd rather just learn a language without all that baggage.
jakogut
Similarly, I went from writing a lot of C to Python, and I appreciate both of them for almost opposite reasons. I ended up finding I like Cython quite a bit, even though the syntax leaves much to be desired. The power, performance, and flexibility of C combined with the batteries included nature of Python is a match made in heaven.
You're also still very much free to write either language purely, and "glue" them together easily using Cython.
lqet
I fully understand that sentiment. For several years now, I have also felt the strong urge to develop something in pure C. My main language is C++, but I have noticed over and over again that I really enjoy using the old C libraries - the interfaces are just so simple and basic, there is no fluff. When I develop methods in pure C, I always enjoy that I can concentrate 100% on algorithmic aspects instead of architectural decisions I only have to make because of the complexity of the language (C++, Rust). To me, C is so attractive because it is so powerful, yet so simple that you can hold all the language features in your head without difficulty.
I also like that C forces me to do stuff myself. It doesn't hide the magic and complexity. Also, my typical experience is that if you have to write your standard data structures on your own, you not only learn much more, but you also quickly see possible performance improvements for your specific use case that would otherwise have been hidden below several layers of library abstractions.
This has put me in a strange situation: everyone around me is always trying to use the latest feature of the newest C++ version, while I increasingly try to get rid of C++ features. A typical example I have encountered several times now is people using elaborate setups with std::string_view to avoid string copying, while exactly the same functionality could have been achieved with less code, using just a simple raw const char* pointer.
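For instance, a contrived sketch of the idea, where the "view" is just a pointer (plus, if needed, a length) into the original buffer:

#include <stdio.h>
#include <string.h>

/* Print `len` characters starting at `s`: no allocation, no copying. */
static void print_view(const char *s, size_t len) {
    printf("%.*s\n", (int)len, s);
}

int main(void) {
    const char *path = "/usr/local/bin/cc";
    const char *base = strrchr(path, '/') + 1;  /* points into `path`, no copy */
    print_view(path, (size_t)(base - path));    /* prints "/usr/local/bin/"    */
    print_view(base, strlen(base));             /* prints "cc"                 */
    return 0;
}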
bsenftner
About 16 years ago I started working with a tech company that used "C++ as C", meaning they used a C++ compiler but wrote pretty much everything in C, with the exception of using classes, but more like Python data classes, with no polymorphism or inheritance, only composition. Their classes were not to hide, but to encapsulate. Over time, some C++ features were allowed, like lambdas, but in general we wrote data classed C - and it screamed, it was so fast. We did all our own memory management, yes, using C style mallocs, and the knowledge of what all the memory was doing significantly aided our optimizations, as we targeted to be running with on cache data and code as much as possible. The results were market leading, and the company's facial recognition continually lands in the top 5 algorithms at the annual NIST FR Vendor test.
hliyan
Funnily enough, 16 years ago, I too was in exactly this type of company. C++, using classes, inheritance only for receiving callbacks, most attributes were public (developers were supposed to know how not to piss outside the bowl), pthreads with mutexed in-memory queues for concurrency, no design patterns (yes we used globals instead of Singleton) etc. So blazingly fast we were measuring latencies in sub 100-microseconds. Now, when modern developers say something is "blazingly fast" when it's sub-second, I can only shake my head in disbelief.
bsenftner
Yes, very similar. We had pthreaded mutexed queues too, and measured timings with clock_gettime() with CLOCK_MONOTONIC. Our facial template match runs at 25M compares per second per core, and simply keeping that pipeline fed required all kinds of timing synchronizations.
The only community I know that produces developers that know this kind of stuff intimately are console game programmers, and then only the people responsible for maintaining the FPS at 60. I expect the embedded community knows this too, but is too small for me to know many of them to get a sense of their general technical depth.
zelphirkalt
This, however, is an implementation strategy that most business/"enterprise" developers will fight tooth and nail, because they believe that OOP is the only workable way to create abstractions and that any successful project must have inheritance in it. They never learned another way.
Actually Rust goes a long way towards a similar strategy, by avoiding inheritance and instead relying on structs and traits. That already avoids a lot of BS programming. I am very glad they threw out inheritance and classes. Great design decision right there. I wish FP was made more convenient/possible in Rust though.
zaphirplane
Sounds like they know what they are doing. How is using C++ with only data classes different from using C with structs?
relaxing
Namespaces are useful for wrapping disparate bits of C code, to get around namespace collisions during integration.
int_19h
Templates (without crazy metaprogramming stuff) can be a godsend for basic data structures.
porridgeraisin
Slightly better ergonomics I suppose. Member functions versus function pointers come to mind, as do references vs pointers (so you get to use . instead of ->)
drwu
Also fast to compile!
brucehoult
Try doing C with a garbage collector ... it's very liberating.
Do `#include <gc.h>` then just use `GC_malloc()` instead of `malloc()` and never free. And add `-lgc` to linking. It's already there on most systems these days, lots of things use it.
You can add some efficiency by calling `GC_free()` in cases where you're really really sure, but it's entirely optional and adds a lot of danger. Using `GC_malloc_atomic()` also adds efficiency, especially for large objects, if you know for sure there will be no pointers in that object (e.g. a string, buffer, image, etc).
There are weak pointers if you need them. And you can add finalizers for those rare cases where you need to close a file or network connection or something when an object is GCd, rather than knowing programmatically when to do it.
But simply using `GC_malloc()` instead of `malloc()` gets you a long long way.
You can also build Boehm GC as a full transparent `malloc()` replacement, and replacing `operator new()` in C++ too.
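For anyone who hasn't tried it, a minimal sketch of what that looks like in practice (assuming libgc and its headers are installed; build with something like `cc demo.c -lgc`):

#include <gc.h>       /* Boehm-Demers-Weiser collector */
#include <stdio.h>
#include <string.h>

int main(void) {
    GC_INIT();        /* recommended once at startup */

    for (int i = 0; i < 1000000; i++) {
        char *buf = GC_malloc(64);                      /* reclaimed when unreachable */
        snprintf(buf, 64, "iteration %d", i);
        char *copy = GC_malloc_atomic(strlen(buf) + 1); /* no pointers inside, so never scanned */
        strcpy(copy, buf);
        /* no free() anywhere */
    }
    puts("done");
    return 0;
}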
enriquto
> Try doing C with a garbage collector ... it's very liberating.
> Do `#include <gc.h>` then just use `GC_malloc()` instead of `malloc()` and never free.
Even more liberating (and dangerous!): do not even malloc, just use variable-length arrays:
void f(float *y, float *x, int n)
{
    float t[n]; // temporary array, destroyed at the end of scope
    ...
}
This style forces you to alloc the memory at the outermost scope where it is visible, which is a nice thing in itself (even if you use malloc).
kqr
At first I really liked this idea, but then I realised the size of stack frames is quite limited, isn't it? So this would work for small data but perhaps not big data.
hdrz
C with dynamic arrays and classes? Object pascal says hello…
kokada
I think one of the nice things about C is that, since the language was not designed to abstract away e.g. the heap, it is really easy to replace manual memory management with a GC or any other approach to managing memory, because most APIs expect to be called with `malloc()` when heap allocation is needed.
I think the only other language that has a similar property is Zig.
irq-1
Odin has this too:
> Odin is a manual memory management based language. This means that Odin programmers must manage their own memory, allocations, and tracking. To aid with memory management, Odin has huge support for custom allocators, especially through the implicit context system.
https://odin-lang.org/docs/overview/#implicit-context-system
DeathArrow
>Try doing C with a garbage collector ... it's very liberating.
Doing that means that I lose some speed and I will have to wait for GC collection.
Then why shouldn't I use C# which is more productive and has libraries and frameworks that comes with batteries included that help me build functionality fast.
I thought that one of the main points of using C is speed.
brucehoult
Every time I've tried it, adding GC to a C program makes it faster. It will increase the RAM usage by a bit, exactly when pauses happen is less predictable (unless you explicitly trigger a GC in e.g. your main loop), and individual pauses are possibly larger, but overall throughput increases -- total program execution time goes down.
Malloc() and free() aren't free, and in particular free() does a lot more bookkeeping and takes more time than people realise.
hexo
C# has a disgusting code style. IWillNotReadCode where variable or (and that's even worse) function/method names LookLikeThis.
dlisboa
Which GC are you using in these examples?
umanwizard
I'm not OP but the most popular C GC is Boehm's: https://www.hboehm.info/gc/
wruza
> I also like that C forces me to do stuff myself
I never liked that you have to choose between this and C++ though. C could use some automation, but that's C++ in "C with classes" mode. The sad thing is, you can't convince other people to use this mode, so all you have is either raw C interfaces which you have to wrap yourself, or C++ interfaces which require galaxy brain to fully grasp.
I remember growing really tired of the "add member - add initializer - add finalizer - sweep and recheck finalizers" loop. Or calculating lifetime orders in your mind. If you ask which single word my mind associates with C, it will be "routine".
C++ would be amazing if its culture wasn't so obsessed with needless complexity. We had a local joke back then: every C++ programmer writes heaps of C++ code to pretend that the final page of code is not C++.
rossant
I completely agree with this sentiment. That's why I wrote Datoviz [1] almost entirely in C. I use C++ only when necessary, such as when relying on a C++ dependency or working with slightly more complex data structures. But I love C’s simplicity. Without OOP, architectural decisions become straightforward: what data should go in my structs, and what functions do I need? That’s it.
The most inconvenient aspect for me is manual memory management, but it’s not too bad as long as you’re not dealing with text or complex data structures.
hgs3
Agreed. C, Go, Python, and Lua are my go-to languages because of their simplicity. It's unfortunate, but in my opinion, most mainstream languages are needlessly complex.
In my experience, whether it's software architecture or programming language design, it's easy to make things complicated, but it takes vision and discipline to keep them simple.
pansa2
> C, Go, Python, and Lua are my go-to languages because of their simplicity
One of these things is not like the others! Python's complexity has been increasing rapidly (e.g. walrus operator, match statement, increasingly baroque type hints) - has this put you off the language at all?
seanw444
I started programming with Python years and years ago. My journey has led me to prefer low-level languages (currently in a Zig bout), and revisiting Python has been a disappointing experience. Not necessarily bad, but instead of focusing efforts where they really matter, such as eliminating the GIL, they seem to be focused on adding new syntactic sugar.
hgs3
Python's complexity has been increasing since Guido stepped away and it is off-putting. The language is still nowhere near as complex as Ruby or others, but I am keeping my eyes open for alternatives.
DeathArrow
I like the idea of using C++ as C. I began disliking OOP, inheritance and encapsulation, heavy usage of GoF patterns, and even SOLID. They promise easy to understand, easy to follow, easy to maintain, easy to change, easy to extend code and good productivity, but the effect is the contrary most of the time.
I like functional programming and procedural programming. Fits better to how I think about code. Code is something that takes data and spits data. Code shouldn't be forced into emulating some real life concepts.
chasd00
Most of the embedded world is still C; if you want to write C, that's probably the place to find a community.
zafka
I agree with this sentiment. My first gig was telecom, and I wrote in a Pascal-like language called CHILL, but I found out my forte was debugging and patching and ended up doing a fair amount of assembly code that would get applied to live systems. During the decade-plus I spent in medical devices, I used C and assembly. The thing is, if you own all the code and actually understand what it is supposed to do, you can write safe code.
kqr
I started programming with C a long time ago, and even now, every few months, I dream of going back to those roots. It was so simple. You wrote code, you knew roughly which instructions it translated to, and there you went!
Then I try actually going through the motions of writing a production-grade application in C and I realise why I left it behind all those years ago. There's just so much stuff one has to do on one's own, with no support from the computer. So many things that one has to get just right for it to work across edge cases and in the face of adversarial users.
If I had to pick up a low-level language today, it'd likely be Ada. Similar to C, but with much more help from the compiler with all sorts of things.
graycat
When Ada was first announced, I rushed to read about it -- sounded good. But so far, never had access to it.
So, now, after a long time, Ada is starting to catch on???
When Ada was first announced, back then, my favorite language was PL/I, mostly on CP67/CMS, i.e., IBM's first effort at interactive computing with a virtual machine on an IBM 360 instruction set. Wrote a little code to illustrate digital Fourier calculations, digital filtering, and power spectral estimation (statistics from the book by Blackman and Tukey). Showed the work to a Navy guy at the JHU/APL and, thus, got "sole source" on a bid for some such software. Later wrote some more PL/I to have 'compatible' replacements for three of the routines in the IBM SSP (scientific subroutine package) -- converted 2 from O(n^2) to O(n log(n)) and the third got better numerical accuracy from some Ford and Fulkerson work. Then wrote some code for the first fleet scheduling at FedEx -- the BOD had been worried that the scheduling would be too difficult, some equity funding was at stake, and my code satisfied the BOD, opened the funding, and saved FedEx. Later wrote some code that saved a big part of IBM's AI software YES/L1. Gee, liked PL/I!
When I started on the FedEx code, I was still at Georgetown (teaching computing in the business school and working in the computer center) and working from my apartment. So I called the local IBM office and ordered the PL/I Reference, Program Guide, and Execution Logic manuals. Soon they arrived, for free, via a local IBM sales rep highly curious why someone would want those manuals -- a sign of something big?
Now? Microsoft's .NET. On Windows, why not??
phicoh
I recently started re-reading "Programming in Ada" by J.G.P. Barnes about the original Ada. In my opinion, it was not that good of a language. Plenty of ways to trigger undefined behavior.
Where C was clearly designed to be a practical language with feedback from implementing an operating system in C. Ada lacked that kind of practical experience. And it shows.
I don't know anything about modern day Ada, but I can see why it didn't catch on in the Unix world.
micronian2
I recall watching a presentation about C++20. During the presentation, the presenter said there were about 163 undefined behaviors in the C language (note: I think it was C99) which implied there were many more in C++ since it’s a much more complex language. Unfortunately, I don’t have a link to that presentation.
You might have heard about the SPARK variant of Ada. I recall reading in an article many years ago that the original version of SPARK was based on Ada83 because it is a very safe language with a lot less undefined behaviors, which is key to trying to statically prove the correctness of a program.
pyjarrett
> Plenty of ways to trigger undefined behavior
I'm curious about this list, because it definitely doesn't seem that way these days. It'd be interesting to see how many of these are still possible now.
pjmlp
> So, now, after a long time, Ada is starting to catch on???
Money and hardware requirements.
Finally there is a mature open source compiler, and our machines are light years beyond those beefy workstations required for Ada compilers in the 1980's.
jancsika
> I started programming with C a long time ago, and even now, every few months, I dream of going back to those roots. It was so simple. You wrote code, you knew roughly which instructions it translated to, and there you went!
Related-- I'm curious what percentage of Rust newbies "fighting the borrow checker" is due to the compiler being insufficiently sophisticated vs. the newbie not realizing they're trying to get Rust to compile a memory error.
MaulingMonkey
I certainly spent most (95%+?) of my "fighting the borrow checker" time writing code I would never try to write in C++. A simple example is strings: I'd spend a lot of time trying to get a &str to work instead of a String::clone, where in equivalent C++ code I'd never use std::string_view over std::string - not because it would be a memory error to do so in my code as it stood, but because it'd be nearly impossible to keep it memory safe with code reviews and C++'s limited static analysis tooling.
This was made all the worse by the fact that I frequently, eventually, succeeded in "winning". I would write unnecessary and unprofiled "micro-optimizations" that I was confident were safe and would remain safe in Rust, that I'd never dare try to maintain in C++.
Eventually I mellowed out and started .clone()ing when I would deep copy in C++. Thus ended my fight with the borrow checker.
wolvesechoes
Not everything can be proved at compile-time, so necessarily Rust is going to complain about things that you know can be done safely in that specific context.
For example some tree structures are famously PITA in Rust. Yes, possible, but PITA nonetheless.
phicoh
If you come from C to Rust you basically have to rewire your brain. There are some corner cases that are wrong in Rust, but mostly you have to get used to a completely new way of thinking about object lifetimes and references to objects.
baq
...and then you come back to your C code and think 'how could I not think of these things'.
saati
> You wrote code, you knew roughly which instructions it translated to, and there you went!
This must have been a very, very long time ago; with optimizing compilers you don't really even know if they will emit any instructions at all.
kqr
On x86-type machines, you still have a decent chance, because the instructions themselves are so complicated and high-level. It's not that C is close to the metal, it's that the metal has come up to nearly the level of C!
I wouldn't dare guess what a compiler does to a RISC target.
(But yes, this was back in the early-to-mid 2000s I think. Whether that is a long time ago I don't know.)
bee_rider
Another way of looking at it (although I'm not sure if I believe this, haha): it might be easy to guess what the C compiler will spit out for the proprietary bytecode known as "x86." It is hard to guess what actual machine code (uops) it will be jitted to when it is actually compiled by the x86 virtual machine.
wholinator2
I'd call it a while ago, but not a long time. Long time to me is more like 70s or 80s. I was born in 1996 so likely I'm biased: "before me=long time". It would be interesting to do a study on that. Give the words, request the years, correlate with birthyear, voila
tempodox
> I wouldn't dare guess what a compiler does to a RISC target.
Just let your C(++) compiler generate assembly on an ARM-64 platform, like Apple Silicon or iOS. Fasten your seat belt.
pjmlp
Yeah, back in the MS-DOS and Amiga glory days when C compilers were dumb, and anyone writing Assembly by hand could easily outperform them.
C source files for demoscene and games were glorified macro assemblers full of inline assembly.
postexitus
Thanks for the reference to Amiga. Every random reference to my beloved computer fills me with joy.
pjmlp
In my circle of friends I was the PC guy; however, I got to enjoy quite a bit of the Amiga ecosystem, thanks to us always being around each other, and to some incursions into our demoscene attempts.
uecker
C compilers got a lot better though and sanitizers and analyzers can also easily catch a lot of mistakes.
anta40
Don't forget Pascal is still alive.
wruza
From what I remember about Ada, it is basically Pascal for rockets.
int_19h
Only in terms of syntax. Feature-wise it's more like C++ but everything is explicit.
kevin_thibedeau
With operator precedence fixed to not be an annoyance.
sgt
And some call it Boomer Rust, if I recall.
bayindirh
Also, COBOL and FORTRAN. FORTRAN is still being developed and is one of the languages supported as a first-class citizen by MPI.
There's a big cloud of hype at the bleeding edge, but if you dare to look beyond that cloud, there are many boring and well matured technologies doing fine.
m463
> no support from the computer
There are a lot of things that are so USEFUL, but maddening.
C is one. make is another.
They serve a really valid purpose, but because they are stable, they have also not evolved at all.
From your Ada example, I love package and package body. C has function prototypes, but they are almost meaningless.
Everyone seems to think C++ is C grown-up, but I don't really like it. It is more like systemd. People accept it but don't love it.
dan_quixote
> Similar to C, but with much more help from the compiler with all sorts of things.
Is that not the problem rust was created to solve?
kelnos
Rust is more like C++ (though still not really) than like C. Rust is a complete re-imagination of what a systems language could be.
phicoh
My conclusion is that C is not a good basis for what Rust is trying to do. The kind of reliability Rust is trying to provide with almost no runtime overhead requires a much more complex language than C.
kqr
Indeed. I'm still not entirely sure why Rust was created when we have Ada, but if I had to guess it's mainly because Rust has slightly more advanced tricks for safe memory management, and to some degree because Rust has curly braces.
kragen
Ada doesn't attempt to statically exclude data races or aliasing bugs. Rust does. I guess you're calling that "slightly more advanced tricks for safe memory management", which sounds wildly inaccurate to me; those problems aren't usually considered "memory management" at all. Rust also has better error handling, a stronger static type system, a Turing-complete compile-time macro system, and much less verbose code.
jhbadger
Ada use was strongly tied to the US DoD, which put some people off it, for one.
tromp
Here's what kc3 code looks like (taken from [1]):
def route = fn (request) {
  if (request.method == GET ||
      request.method == HEAD) do
    locale = "en"
    slash = if Str.ends_with?(request.url, "/") do "" else "/" end
    path_html = "./pages#{request.url}#{slash}index.#{locale}.html"
    if File.exists?(path_html) do
      show_html(path_html, request.url)
    else
      path_md = "./pages#{request.url}#{slash}index.#{locale}.md"
      if File.exists?(path_md) do
        show_md(path_md, request.url)
      else
        path_md = "./pages#{request.url}.#{locale}.md"
        if File.exists?(path_md) do
          show_md(path_md, request.url)
        end
      end
    end
  end
}
[1] https://git.kmx.io/kc3-lang/kc3/_tree/master/httpd/page/app/...
cgh
Yeah, I'm not sure a lot of people read the article. This isn't really a back to basics, going back to C, forgoing complexity type of article, but instead it's about developing a new programming language called KC3 to make use of ideas he originally developed in Lisp.
relistan
Maybe early return isn’t allowed in that language, but it sure would make that a heck of a lot easier to read.
jkhdigital
The author mentions being deeply inspired and influenced by Jose Valim; I guess this means (approximately) that KC3 is to C as Elixir is to Erlang?
ManBeardPc
C was my first language and I quickly wrote my first console apps and a small game with Allegro. It feels incredibly simple in some aspects. I wouldn't want to go back, though. The build tools and managing dependencies feel outdated; somehow there is always a problem somewhere. Includes and the macro system feel crude. It's easy to invoke undefined behavior and only realize it later because a different compiler version or flag now optimizes differently. Zig is my new C: it includes a C compiler and I can just import C headers and use them without wrappers. Comptime is awesome. Build tool, dependency management, and testing are included. Cross compilation is easy. It just looks like a modern version of C. If you can live with a language that is still in development, I would strongly suggest taking a look.
Otherwise I use Go if a GC is acceptable and I want a simple language or Rust if I really need performance and safety.
contificate
I sometimes write C recreationally. The real problem I have with it is that it's overly laborious for the boring parts (e.g. spelling out inductive datatypes). If you imagine that a large amount of writing a compiler (or similar) in C amounts to juggling tagged unions (allocating, pattern matching over, etc.), it's very tiring to write the same boilerplate again and again. I've considered writing a generator to alleviate much of the tedium, but haven't bothered to do it yet. I've also considered developing C projects by appealing to an embeddable language for prototyping (like Python, Lua, Scheme, etc.), and then committing the implementation to C after I'm content with it (otherwise, the burden of implementation is simply too high).
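To make that concrete, here's the kind of boilerplate meant (a made-up toy expression type, not from any particular project):

#include <stdio.h>
#include <stdlib.h>

typedef struct Expr Expr;

enum expr_tag { EXPR_NUM, EXPR_ADD, EXPR_MUL };

struct Expr {
    enum expr_tag tag;
    union {
        int num;
        struct { Expr *lhs, *rhs; } bin;
    } u;
};

/* One hand-written constructor per variant... */
static Expr *expr_num(int n) {
    Expr *e = malloc(sizeof *e);
    e->tag = EXPR_NUM;
    e->u.num = n;
    return e;
}

static Expr *expr_bin(enum expr_tag tag, Expr *lhs, Expr *rhs) {
    Expr *e = malloc(sizeof *e);
    e->tag = tag;
    e->u.bin.lhs = lhs;
    e->u.bin.rhs = rhs;
    return e;
}

/* ...and a switch standing in for pattern matching. */
static int eval(const Expr *e) {
    switch (e->tag) {
    case EXPR_NUM: return e->u.num;
    case EXPR_ADD: return eval(e->u.bin.lhs) + eval(e->u.bin.rhs);
    case EXPR_MUL: return eval(e->u.bin.lhs) * eval(e->u.bin.rhs);
    }
    return 0;
}

int main(void) {
    Expr *e = expr_bin(EXPR_ADD, expr_num(1),
                       expr_bin(EXPR_MUL, expr_num(2), expr_num(3)));
    printf("%d\n", eval(e));   /* 7; freeing omitted for brevity */
    return 0;
}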
It's difficult because I do believe there's an aesthetic appeal in doing certain one-off projects in C: compiled size, speed of compilation, the sense of accomplishment, etc. but a lot of it is just tedious grunt work.
anymouse123456
I've been discovering that the grunt work increases logarithmically with how badly I OO the C.
When I simplify and think in terms of streams, it starts getting nice and tidy.
randomNumber7
Despite what some people religiously think about programming languages, imo C was so successful because it is practical.
Yes it is unsafe and you can do absurd things. But it also doesn't get in the way of just doing what you want to do.
ycuser2
I don't think C was successful. It still is! What other language from the 70s is still in the top 5 languages?
Horffupolde
SQL, Lisp.
dharmab
SQL absolutely. Lisp is not anywhere near top 5, though. https://survey.stackoverflow.co/2024/technology#most-popular...
anta40
If you want to do microcontroller/embedded work, I think C is still the overall best choice, supported by vendors. Rust and Ada are probably slowly catching up.
bamboozled
Sounds a bit like perl but at a lower level ?
ThinkBeat
You can certainly do entirely absurd things in Perl. But it is a lot easier / safer to work with. You get / can get a wealth of information when you do the wrong thing in Perl.
With C, a segmentation fault is not always easy to pinpoint.
However, the tooling for C helps: with some of the IDEs out there you can set breakpoints, walk through the code in a debugger, and spot more errors at compile time.
There is a debugger included with Perl, but after trying to use it a few times I have given up on it.
Give me C and Visual Studio when I need debugging.
On the less positive side, shooting yourself in the foot with C is a common occurrence.
I have never had a segmentation fault in Perl. Nor have I had any problems managing the memory; the garbage collector appears to work well (at least for my needs).
TinkersW
Eh, segfaults are like the easiest error to debug; they almost always tell you exactly where the problem is.
high_priest
Sounds a bit like JavaScript, but at a lower level?
zerr
No, it's because of Unix and AT&T monopoly.
dboreham
Monopoly of the long distance telephone call market??
relaxing
How was AT&T’s monopoly a driver? It’s not like they forced anyone to use UNIX.
linguae
Ironically, AT&T's monopoly actually helped the adoption of Unix, but not in an exploitative way. In 1956, AT&T was subject to a consent decree by the US government, where AT&T was allowed to maintain its phone monopoly but was not allowed to expand its market to other sectors. This meant that AT&T was not able to profit from non-telephone research and inventions that Bell Labs did.
During Unix's early days, AT&T was still under this decree, meaning that it would not sell Unix like how competitors sold their operating systems. However, AT&T licensed Unix, including its source code, to universities for a nominal fee that covered the cost of media and distribution. UC Berkeley was one of the universities that purchased a Unix licenses, and researchers there started making additions to AT&T Unix which were distributed under the name Berkeley Software Distribution (this is where BSD came from). There is also a famous book known as The Lions' Book (https://en.wikipedia.org/wiki/A_Commentary_on_the_UNIX_Opera...) that those with access to a Unix license could read to study Unix. Bootleg copies of this book were widely circulated. The fact that university students, researchers, and professors could get access to an operating system (source code included) helped fuel the adoption of Unix, and by extension C.
When the Bell System was broken up in 1984, AT&T still retained Bell Labs and Unix. The breakup of the Bell System also meant that AT&T was no longer subject to the 1956 consent decree, and thus AT&T started marketing and selling Unix as a commercial product. Licensing fees skyrocketed, which led to an effort by BSD developers to replace AT&T code with open-source code, culminating with 4.3BSD Net/2, which is the ancestor of modern BSDs (FreeBSD, NetBSD, OpenBSD). The mid-1980s also saw the Minix and GNU projects. Finally, a certain undergraduate student named Linus Torvalds started work on his kernel in the early 1990s when he was frustrated with how Minix did not take full advantage of his Intel 386 hardware.
Had AT&T never been subject to the 1956 consent decree, it's likely that Unix might not have been widely adopted since AT&T probably wouldn't have granted generous licensing terms to universities.
Sunspark
Without reading the comments here, I read the blog entry first.
After I finished I was puzzled, "what is the author trying to communicate to the reader here?"
As near as I can determine, enough people weren't using the author's program/utility because it was written in a language that hasn't been blessed by the crowd? It is hinted at that there might be issues involving memory consumption.
The author does not write lessons learned or share statistics of user uptake after the re-write.
No new functionality was gained, presumably this exercise was done as practice reps because the author could do it and had time.
No argument was made that the author has seen the light and now only C from this point on.
TurboHaskal
This reads like a cautionary tale about getting nerdsniped, without a happy ending.
wolfspaw
" I was gaining a lot of money with Ruby on Rails
Then, I decided to move to Common Lisp and start gaining less and less money
Then, I decided to move to C and got Nerd Snipped "
Well, atleast he seems more happy xD
C is cool though
CrimsonCape
Yeah, I think every programmer experiences the "I should write a language" moment when the solution to the problem is abstracted to be the language itself.
codr7
I think every programmer should at some point write their own language.
Tractor8626
Nothing makes sense.
What is your killer app? What does CL have to do with no one running it? What problems did you have with garbage collectors? Why is C the only option? Are you sure all those RCEs are because of VMs and containers and not because it's all written in C? "There are no security implications of running KC3 code" - are you sure?
FrustratedMonky
Maybe the moral here is learning Lisp made him a better C programmer.
Could he have jumped right into C and had amazing results if not for the journey of learning Lisp and changing how he thought about programming?
Maybe learning Lisp is how to learn to program. Then other languages become better by virtue of how someone structures the logic.
codr7
I would definitely recommend any programmer to learn both Lisp and C at some point.
bArray
> Virtual machines still suck a lot of CPU and bandwidth for nothing but emulation. Containers in Linux with cgroups are still full of RCE (remote command execution) and privilege escalation. New ones are discovered each year. The first report I got on those listed 10 or more RCE + PE (remote root on the machine). Remote root can probably also escape VMs.
A proper virtual machine is extremely difficult to break out of (but it can still happen [1]). Containers are a lot easier to break out of. If virtual machines were more efficient in either CPU or RAM I would want to use them more, but it's the worst of both.
[1] https://www.zerodayinitiative.com/advisories/ZDI-23-982/
kelnos
I'm kinda in the opposite camp. After doing a bunch of VB in my tweens and teens, I learned Java, C, and C++ in college, settling on mostly C for personal and professional projects. I became a core developer of Xfce and worked on that for 5 years.
Then I moved into backend development, where I was doing all Java, Scala, and Python. It was... dare I say... easy! Sure, these kinds of languages bring with them other problems, but I loved batteries-included standard libraries, build systems that could automatically fetch dependencies -- and oh my, such huge communities with open-source libraries for nearly anything I could imagine needing. Even if most of the build systems (maven, sbt, gradle, pip, etc.) have lots of rough edges, at least they exist.
Fast forward 12 years, and I find myself getting back into Xfce. Ugh. C is such a pain in the ass. I keep reinventing wheels, because even if there's a third-party library, most of the time it's not packaged on many of the distros/OSes our users use. Memory leaks, NULL pointer dereferences, use-after-free, data races, terrible concurrency primitives, no tuples, no generics, primitive type system... I hate it.
I've been using Rust for other projects, and despite it being an objectively more difficult language to learn and use, I'm still much more productive in Rust than in C.