
21st Century C++

280 comments · February 5, 2025

coffeeaddict1

The C++ Core Guidelines have existed for nearly 10 years now. Despite this, not a single implementation in any of the three major compilers exists that can enforce them. Profiles, which Bjarne et al have had years to work on, will not provide memory safety[0]. The C++ committee, including Bjarne Stroustrup, needs to accept that the language cannot be improved without breaking changes. However, it's already too late. Even if somehow they manage to make changes to the language that enforce memory safety, it will take a decade before the efforts propagate at the compiler level (a case in point is modules being standardised in 2020 but still not ready for use in production in any of the three major compilers).

[0] https://www.circle-lang.org/draft-profiles.html

Animats

> The C++ committee, including Bjarne Stroustrup, needs to accept that the language cannot be improved without breaking changes.

The example in the article starts with "Wow, we have unordered maps now!" Just adding things modern languages have is nice, but doesn't fix the big problems. The basic problem is that you can't throw anything out. The mix of old and new stuff leads to obscure bugs. The new abstractions tend to leak raw pointers, so that old stuff can be called.

C++ is almost unique in having hiding ("abstraction") without safety. That's the big problem.

amluto

I find the unordered_map example rather amusing. C++’s unordered_map is, somewhat infamously, specified in an unwise way. One basically cannot implement it with a modern, high performance hash table for at least two reasons:

1. unordered_map requires some bizarre and not widely useful abilities that mostly preclude hash tables with probing:

https://stackoverflow.com/questions/21518704/how-does-c-stl-...

2. unordered_map has fairly strict iteration and pointer invalidation rules that are largely incompatible with the implementations that turn out to be the fastest. See:

> References and pointers to either key or data stored in the container are only invalidated by erasing that element, even when the corresponding iterator is invalidated.

https://en.cppreference.com/w/cpp/container/unordered_map

And, of course, this is C++, where (despite the best efforts of the “profiles” people), the only way to deal with lifetimes of things in containers is to write the rules in the standards and hope people notice. Rust, in contrast, encodes the rules in the type signatures of the methods, and misuse is deterministically caught by the compiler.
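
A small sketch of the guarantee being quoted, assuming a conforming implementation (the pointer has to survive any number of rehashes, which is exactly what rules out open-addressing tables that relocate elements):

    #include <cassert>
    #include <string>
    #include <unordered_map>

    int main() {
        std::unordered_map<int, std::string> m{{1, "one"}};
        std::string* p = &m[1];            // pointer to the mapped value
        for (int i = 2; i < 100000; ++i)   // force several rehashes
            m[i] = "filler";
        assert(p == &m[1]);                // must still hold per the quoted rule
    }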

tialaramex

Like std::vector, std::unordered_map also doesn't do a good job on reservation. I've never been entirely sure what to make of that - did they not care? Or is there some subtle reason why what they're doing made sense on the 1980s computers where this was conceived?

For std::vector it apparently just didn't occur to the C++ people to provide the correct API; Bjarne Stroustrup claims the only reason to use a reservation API is to prevent reference and iterator invalidation. -shrug-

[std::unordered_map was standardised this century, but the thing standardised isn't something you'd design this century; it's the data structure you'd have been shown in an undergraduate Data Structures class 40 years ago.]

IshKebab

You absolutely can throw things out, and they have! Dynamic exception specifications, `auto`, and breaking changes to operator== are three I know of. There were also some minor breaking changes to comparison operators in C++20.

They absolutely could say "in C++26 vector::operator[] will be checked" and add an `.at_unsafe()` method.
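
Nothing stops anyone from expressing that shape today; a minimal sketch (checked_vector and at_unsafe are made-up names, just to show what the proposed default would look like):

    #include <cstddef>
    #include <vector>

    template <class T>
    struct checked_vector : std::vector<T> {
        using std::vector<T>::vector;
        T& operator[](std::size_t i) { return this->at(i); }                  // bounds-checked, throws on error
        const T& operator[](std::size_t i) const { return this->at(i); }
        T& at_unsafe(std::size_t i) { return std::vector<T>::operator[](i); } // explicit opt-out
    };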

They won't though because the whole standards committee still thinks that This Is Fine. In fact the number of "just get good" people in the committee has probably increased - everyone with any brains has run away to Rust (and maybe Zig).

amluto

> auto

It took me several reads to figure out that you probably meant ‘auto’ the storage class specifier. And now I’m wondering whether this was ever anything but a no-op in C++.

TuxSH

> "in C++26 vector::operator[] will be checked"

Every major project that cares about perf and binary size would disable it via the opt-out flag compiler vendors would obviously provide, much like -fno-exceptions today.

Rust's memory and type system offer stronger guarantees, leading to better optimization of bounds checks, AFAIK.

There are more glaring issues to fix, like std::regex performance and so on.

imtringued

"just get good" implies development processes that catch memory and safety bugs. Meaning what they are really saying between the lines is that the minimum cost of C++ development is really high.

Any C++ code without at least unit tests with 100% coverage run under UBSan and the other sanitizers must be considered inherently defective, and the developer should be flogged for his absurd levels of incompetence.

Then there is also the need for UB aware formal verification. You must define predicates/conditions under which your code is safe and all code paths that call this code must verifiably satisfy the predicates for all calls.

This means you're down to the statically verifiable subset of C++, which includes C++ that performs asserts at runtime, in case the condition cannot be verified at compile time.

How many C++ developers are trained in formal verification? As far as I am aware, they don't exist.

Any C++ developers reading this who haven't at least written unit tests with UB sanitizer for all of their production code should be ashamed of themselves. If this sounds harsh, remember that this is merely the logical conclusion of "just get good".

ephaeton

That explains very well why Rust (to me) feels C++ommittee-designed, thanks for that!

htfy96

While I sort of agree with the complaint, personally I think C++'s sweet spot in this ecosystem is still great backward compatibility plus marginal safety improvements.

I would never expect our 10M+ LOC performance-sensitive C++ code base to be formally memory safe, but so far only C++ has allowed us to maintain it for 15 years with partial refactors and minimal upgrade pain.

IshKebab

I think at least Go and Java have as good backwards compatibility as C++.

Most languages take backwards compatibility very seriously. It was quite a surprise to me when Python broke so much code with the 3.12 release. I think it's the exception.

thbb123

I don't know about Go, but Java is pathetic. I have 30-year-old C++ programs that work just fine.

However, an application that I had written to be backward compatible with Java 1.4, 15 years ago, cannot be compiled today. And I had to make major changes to have it run on anything past Java 8, ~10 years ago, I believe.

bigstrat2003

Java has had shit backwards compatibility for as long as I have had to deal with it. Maybe it's better now, but I have not forgotten the days of "you have to use exactly Java 1.4.15 or this app won't work"... with four different apps that each need their own different version of the JRE or they break. The only thing that finally made Java apps tolerable to support was the rise of app virtualization solutions. Before that, it was a nightmare and Java was justly known as "the devil's software" to everyone who had to support it.

menaerus

The language is improving (?), although IME that's somewhat beside the point: I'm finding the new features less useful for everyday code. I'm perfectly happy with C++17/20 for 99% of the code I write. And keeping backwards compatibility for most real-world software is a feature, not a bug, ok? Breaking it would actually drive me away from the language.

pjmlp

CLion, clang-tidy and the Visual C++ analysers do have partial support for the Core Guidelines, and those checks can be enforced.

Granted, it is only those that can be machine verified.

Office is using C++20 modules in production, and Vulkan also has a modules version.

fooker

>Despite this, not a single implementation in any of the three major compilers exists that can enforce them

Because no one wants it enough to implement it.

richard_todd

I feel like a few decades ago, standards were intended to standardize best practices and popular features from compilers in the field. Dreaming up standards that nobody has implemented, like what seems to happen these days, just seems crazy to me.

immibis

It's bottom-up vs top-down design.

lifthrasiir

Or it's better to have other languages besides C++ for that.

skywal_l

I hoped Sean would open source Circle. It seemed promising, but it's been years and I don't see any tangible progress. Maybe I am not looking hard enough?

alexeiz

He's looking to sell Circle. That must be the reason he's not open sourcing it.

fooker

Huh, I guess that was the motivation all along.

janice1999

I think Carbon is more promising to be honest. They are aiming for something production-ready in 2027.

bluGill

Profiles will not provide perfect memory safety, but they go a long way to making things better. I have 10 million lines of C++. A breaking change (doesn't matter if you call it new C++ or Rust) would cost over a billion dollars - that is not happening. Which is to say, I cannot use your perfect solution; I have to deal with what I have today, and if profiles can make my code better without costing a full rewrite, then I want them.

tialaramex

Changes which re-define the language to have less UB will help you if you want safety/correctness and are willing to do some work to bring that code to the newer language. An example would be the initialization rules in (draft) C++26. Historically C++ was OK with you just forgetting to initialize a primitive before using it; that's Undefined Behaviour in the language, so if that happens, too bad, all bets are off. In C++26 that will be Erroneous Behaviour: there's some value in the variable, not always guaranteed to be a valid one (which can be a problem for, say, booleans or pointers), but just looking at the value is no longer UB. And if you forgot to initialize, say, an int or a char, that's fine, since any possible bit sequence is valid; what you did was an error, but it's not necessarily fatal.
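
A minimal illustration of the change, as I understand the draft:

    #include <cstdio>

    int main() {
        int x;                    // no initializer
        std::printf("%d\n", x);   // up to C++23: undefined behaviour, all bets are off
                                  // C++26 draft: erroneous behaviour -- some value gets printed,
                                  // the program is wrong but not licensed to do anything at all
    }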

If you're not willing to do any work then you're just stuck, nobody can help you, magic "profiles" don't help either.

But, if you're willing to do work, why stop at profiles? Now we're talking about a price and I don't believe that somehow the minimum assignable budget is > $1Bn

bluGill

The first part is why I'm excited for future C++ - they are making things better.

The reason I like profiles is they are not all or nothing. I can put them in new code only, or maybe a single file that I'm willing to take the time to refactor. Or at least so I hope; it remains to be seen if that is how they work out. I've been trying to figure out how to make Rust fit in, but std::vector<SomeVirtualInterface> is a real pain to wrap into Rust and so far I haven't managed to get anything done there.

The $1 billion is realistic - this project was a rewrite of a previous product that became unmaintainable, and inflation adjusted the cost was $1 billion. You can maybe adjust that down a little if we are more productive, but not much. You can adjust it down a lot if you can come up with a way to keep our existing C++ and just extend new features and fix the old code only where it really is a problem. The code we wrote in C++98 (because that was all we had in 2010) still compiles with the latest C++23 compiler, and since there are no known bugs it isn't worth updating that code to the latest standards, even though it would be a lot easier to maintain (which we never do) if we did.

saagarjha

This seems bad actually.

wakawaka28

Enforcing style guidelines seems like an issue that should be tackled by non-compiler tools. It is hard enough to make a compiler without rolling in a ton of subjective standards (yes, the core guidelines are subjective!). There are lots of other tools that have partial support for detecting and even fixing code according to various guidelines.

gHosts

It's part of a compiler ecosystem, i.e. the front end is shared.

See clang-tidy and clang analyzer for example.

PS: That's what I like most about the core guidelines: they are trying very hard to stick to guidelines (not rules) that pretty much uncontroversially make things safer _and_ can be checked automatically.

They're explicitly walking away from bikeshed painting like naming conventions and formatting.

wakawaka28

The core guidelines aren't as subjective as other guidelines but they are still subjective. There is plenty of completely sound code out there that violates the core guidelines. Not only are they subjective, but many of them require someone to think about the best way to write the code and whether the unpopular way to write it is actually better.

I know compiler front ends can be and are used to create tooling. The point is, you shouldn't be required to implement some kinds of checking in the course of implementing a compiler. If you use a compiler, you should not be required to do all this analysis every single time you compile (unless it is enforcing an objectively necessary standard, and the cost of running it is negligible).

vr46

Last weekend, I took an old cross-platform app written by somebody else between 1994-2006 in C++ and faffed around with it until it compiled and ran on my modern Mac running 14.x. I upped the CMAKE_CXX_STANDARD to 20, used Clang, and all was good. Actually, the biggest challenge was the shoddy code in the first place, which had nothing to do with its age. After I had it running, Sonar gave me 7,763 issues to fix.

The moral of the story? Backwards compatibility means never leaving your baggage behind.

boris

> [M]any developers use C++ as if it was still the previous millennium. [...] C++ now offers modules that deliver proper modularity.

C++ may offer modules (in fact, it has been offering them since 2020); however, when it comes to their implementation in mainstream C++ compilers, only now are things becoming sort of usable, and modules are still a challenge in more complex projects due to compiler bugs in the corner cases.

I think we need to be honest and upfront about this. I've talked to quite a few people who have tried to use modules but were unpleasantly surprised by how rough the experience was.

TinkersW

Ya that is rather disingenuous, modules aren't ready, and likely won't be for another 5 years.

Also they are difficult to switch to, so I would expect very few established projects to bother.

gpderetta

Modules were known to be difficult to implement and difficult to migrate to. If modules are mainstream in 5 years, it would be an excellent result.

pjmlp

Office is one such established project.

mindcrime

I was an extreme C++ bigot back in the late 90's, early 2000's. My license plate back then was CPPHACKR[1]. But industry trends and other things took my career in the direction of favoring Java, and I've spent most of the last 20+ years thinking of myself as mainly a "Java guy". But I keep buying new C++ books and I always install the C++ tooling on any new box I build. I tell myself that "one day" I'm going to invest the time to bone up on all the new goodies in C++ since I last touched it, and have another go.

When the heck that day will actually arrive, FSM only knows. The will is sort-of there, but there are just SO many other things competing for my time and attention. :-(

[1]: funny side story about that. For anybody too young to remember just how hot the job market was back then... one day I was sitting stopped at a traffic light in Durham (NC). I'm just minding my own business, waiting for the light to change, when I catch a glimpse out of my side mirror, of somebody on foot, running towards my car. The guy gets right up to my car, and I think I had my window down already anyway. Anyway, the guy gets up to me, panting and out of breath from the run and he's like "Hey, I noticed your license plate and was wondering if you were looking for a new job." About then the light turned green in my direction, and I'm sitting there for a second in just stunned disbelief. This guy got out of his car, ran a few car lengths, to approach a stranger in traffic, to try to recruit him. I wasn't going to sit there and have a conversation with horns honking all around me, so I just yelled "sorry man" and drove off. One of the weirder experiences of my life.

ttul

The programmers on the sound team at the video game company I worked for as an intern in 1998 would always stash a couple of extra void pointers in their classes just in case they needed to add something in later. Programmers should never lose sight of pragmatism. Seeking perfection doesn’t help you ship on time. And often, time to completion matters far more than robustness.

OnionBlender

Vulkan does that with `void* pNext` in a lot of its structs so that they can be extended in the future.
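
The pattern looks roughly like this (struct names below are made up for illustration, not the real Vulkan types; only the sType/pNext convention is Vulkan's):

    // Extensible-struct pattern in the style of Vulkan's pNext chains.
    struct BaseCreateInfo {
        int         sType;    // tags the concrete struct type
        const void* pNext;    // chain of extension structs, or nullptr
        // ...original fields...
    };

    struct ShinyFeatureInfo {
        int         sType;
        const void* pNext;
        bool        enableShinyFeature;
    };

    // Callers chain an extension struct onto the base without changing its layout:
    //   ShinyFeatureInfo shiny{ /*sType=*/2, nullptr, true };
    //   BaseCreateInfo   info { /*sType=*/1, &shiny };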

ninkendo

Funny, sounds like the Simpsons gag from the same time period: “what’s wrong with this country? Can’t a man walk down the street without being offered a job?”

https://youtube.com/watch?v=yDbvVFffWV4

mindcrime

Interesting. I was SO into the Simpsons at one time, but somehow I'd never seen that episode (as best as I can remember anyway). Now I feel the urge to go back and rewatch every episode of the Simpsons from the beginning. It would be fun, but man, what a time sink. I started the same thing with South Park a while back and stalled out somewhere around Season 5. I'd like to get back to it, but time... time is always against us.

ninkendo

That episode is by far my #1 favorite. Season 8 Episode 2, “You Only Move Twice”, during the period considered by most to be the peak of the Simpsons show quality, and IMO the best episode of the season.

Cypress Creek was intended to be a reference to Silicon Valley and the tech companies there of the time, and it’s got some of the best comedy in the season (Hank Scorpio is the best one-off character ever in the show IMO.)

kylecazar

AIEXPERT here I come!

mindcrime

Awesome! My current tag is /DEV/AGI :-)

mindcrime

Note to the above: I am wrong. My license plate back then was C++HACKR, with the actual "+" signs. NC license plates do allow that, although while the +'s are on the tag, they don't show up on your registration card or in the DMV computer system.

I mixed up the tag and my old domain name, which was "cpphacker.co.uk" (and later, just cpphacker.com/org).

pro14

what is the job market like now for C++ programmers? I'm looking for a job.

tialaramex

Here's how Bjarne describes that first C++ program:

"a simple program that writes every unique line from input to output"

Bjarne does thank more than half a dozen people, including other WG21 members, for reviewing this paper; maybe none of them read this program?

More likely, like Bjarne they didn't notice that this program has Undefined Behaviour for some inputs and that in the real world it doesn't quite do what's advertised.

Maxatar

The collect_lines example won't even compile, it's not valid C++, but there's undefined behavior in one of the examples? I'm very surprised and would like to know what it is, that would be truly shocking.

tialaramex

Really? If you've worked with C++ it shouldn't be shocking.

The first example uses the int type. This is a signed integer type and in practice today it will usually be the 32-bit signed integer Rust calls i32 because that's cheap on almost any hardware you'd actually use for general purpose software.

In C++ this type has Undefined Behaviour if allowed to overflow. For the 32-bit signed integer that will happen once we see 2^31 identical lines.

In practice the observed behaviour will probably be that it treats 2^32 identical lines as equivalent to zero prior occurrences and I've verified that behaviour in a toy system.
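
For reference, a sketch of that kind of program (not Bjarne's exact code) so the failure mode is concrete:

    #include <iostream>
    #include <string>
    #include <unordered_map>

    int main() {
        std::unordered_map<std::string, int> seen;   // int count: signed overflow is UB
        for (std::string line; std::getline(std::cin, line); )
            if (seen[line]++ == 0)                   // print only the first occurrence
                std::cout << line << '\n';
    }
    // Incrementing the count past 2^31 - 1 is UB; in practice it wraps, so after
    // 2^32 copies of a line it looks "new" again and gets printed a second time.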

Mali-

Bizarre nitpicking - would you rather he used an unbounded integer?

otabdeveloper4

"Undefined behavior" is not a bug. It's something that isn't specified by an ISO standard.

Rust code is 100 percent undefined behavior because Rust doesn't have an ISO standard. So, theoretically some alternative Rust compiler implementation could blow up your computer or steal your bitcoins. There's no ISO standard to forbid them from doing so.

(You see where I'm going with this? Standards are good, but they're a legal construct, not an algorithm.)

notfed

> "Undefined behavior" is not a bug. It's something that isn't specified by an ISO standard.

An ISO standard? According to who, ISO?

otabdeveloper4

Yeah, legal constructs are not actually real and are based on circular logic. (And not just in software, that's a property of legal constructs in general.)

Your point is what?

modernerd

I haven't read much from Bjarne but this is refreshingly self-aware and paints a hopeful path to standardize around "the good parts" of C++.

As a C++ newbie I just don't understand the recommended path I'm supposed to follow, though. It seems to be a mix of "a book of guidelines" and "a package that shows you how you should be using those guidelines via implementation of their principles".

After some digging it looks like the guidebook is the "C++ Core Guidelines":

https://isocpp.github.io/CppCoreGuidelines/CppCoreGuidelines

And I'm supposed to read that and then:

> use parts of the standard library and add a tiny library to make use of the guidelines convenient and efficient (the Guidelines Support Library, GSL).

Which seems to be this (at least Microsoft's implementation):

https://github.com/microsoft/GSL

And I'm left wondering, is this just how C++ is? Can't the language provide tooling for me to better adhere to its guidelines, bake in "blessed" features and deprecate what Bjarne calls, "the use of low-level, inefficient, and error-prone features"? I feel like these are tooling-level issues that compilers and linters and updated language versions could do more to solve.

bb88

The problem with 45 years of C++ is that different eras used different features. If you have 3 million lines of C++ code written in the 1990's that still compiles and works today, should you use new 202x C++ features?

I still feel the sting of being bit by C++ features from the 1990s that turned out to be footguns.

Honestly, I kinda like the idea of "wrapper" languages. Typescript/Kotlin/Carbon.

fuzztester

>footguns

I was expecting that someone would have posted this by now:

How to Shoot Yourself In the Foot:

https://www-users.york.ac.uk/~ss44/joke/foot.htm

kstrauser

I'm curious about that now, too. Is there the equivalent of Python's ruff or Rust's cargo clippy that can call out code that is legal and well-formed but could be better expressed another way?

bluGill

Clang-tidy can rewrite some old code to a better style. However, there is a lot of working code from the 1990s that cannot be automatically rewritten to a new style, which is what makes adding tooling hard: somehow you need to figure out which code should follow the new style and which is old style where updating to modern would be too expensive.
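
For a concrete flavor of what the automatic rewrites can do, clang-tidy's modernize-loop-convert check turns iterator loops into range-for (illustrative before/after, not actual tool output):

    #include <vector>

    void bump_old(std::vector<int>& v) {
        for (std::vector<int>::iterator it = v.begin(); it != v.end(); ++it)
            *it += 1;               // 1990s-style iterator loop
    }

    void bump_new(std::vector<int>& v) {
        for (int& x : v)            // what modernize-loop-convert rewrites it to
            x += 1;
    }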

lenkite

> As a C++ newbie I just don't understand the recommended path I'm supposed to follow, though

Did you even read the article? He has given the recommended path in the article itself.

Two books describe C++ following these guidelines except when illustrating errors: "A Tour of C++" for experienced programmers and "Programming: Principles and Practice Using C++" for novices. Two more books explore aspects of the C++ Core Guidelines:

J. Davidson and K. Gregory: Beautiful C++: 30 Core Guidelines for Writing Clean, Safe, and Fast Code. 2021. ISBN 978-0137647842.

R. Grimm: C++ Core Guidelines Explained. Addison-Wesley. 2022. ISBN 978-0136875673.

einpoklum

> And I'm left wondering, is this just how C++ is? Can't the language provide tooling for me to better adhere to its guidelines

Well, first, the language can't provide tooling: C++ is defined formally, not through tools; and tools are not part of the standard. This is unlike, say, Rust, where IIANM - so far, Rust has been what the Rust compiler accepts.

But it's not just that. C++ design principles/goals include:

* multi-paradigmatism;

* good backwards compatibility;

* "don't pay for what you don't use"

and all of these in combination prevent baking in almost anything: It will either break existing code; or force you to program a certain way, while legitimate alternatives exist; or have some overhead, which you may not want to pay necessarily.

And yet - there are attempts to "square the circle". An example is Herb Sutter's initiative, cppfront, whose approach is to take in an arguably nicer/better/easier/safer syntax, and transpile it into C++ :

https://github.com/hsutter/cppfront/

nialv7

How does enforcing profiles per-translation unit make any sense? Some of these guarantees can only be enforced if assumptions are made about data/references coming from other translation units.

Maxatar

This is the one major stumbling block for profiles right now that people are trying to fix.

C++ code involves numerous templates, and the definition of those templates is almost always in a header file that gets included into a translation unit. If a safety profile is enabled in one translation unit that includes a template, but is omitted from another translation unit that includes that same template... well what exactly gets compiled?

The rule in C++ is that it's okay to have multiple definitions of a declaration if each definition is identical. But if safety profiles exist, this can result in two identical definitions having different semantics.

There is currently no resolution to this issue.
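
A sketch of the problem with a hypothetical bounds-checking profile (the profile switch here is invented purely for illustration):

    // shared.h -- included by two translation units
    #include <vector>

    template <class T>
    T& first(std::vector<T>& v) {
        return v[0];   // checked or unchecked? depends on the including TU's profile
    }

    // tu_a.cpp: compiled with the (hypothetical) bounds profile enabled
    // tu_b.cpp: compiled without it
    // Both instantiate first<int>; the linker keeps a single definition,
    // so one TU silently gets the other's semantics -- an ODR-style conflict.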

juliangmp

I guess modules are supposed to be the magic solution for that; Bjarne has shown them in this article, even using import std.

It's a bit optimistic because modules are still not really a viable option in my eyes: you need proper support from the build systems, and notably CMake only has limited support for them right now.

humanrebar

Modules alone do not guarantee one definition per entity per linked program. On the contrary, build systems need to add design complexity to support, for instance, multiple built module interfaces for the std module, because different translation units consume the std module with different settings - different standards versions, for instance.

jpc0

I've been playing with building out an OpenGL app using C++23 on bleeding-edge CMake and Clang, and it really is a breath of fresh air... I do run into bugs in both, but it is really nice. Most of the bugs are related to import std, which is expected... Oh, and clangd (LSP) still has very spotty support for modules.

The tooling is way better than it was 6 months ago though, as in I can actually compile code in a non-Visual-Studio project using import std.

I will be extremely happy the day I no longer need to see a preprocessor directive outside of library code.
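
For anyone curious, a minimal translation unit of the kind described (this assumes a very recent Clang, libc++, and CMake with the experimental "import std" support turned on):

    // hello.cpp -- C++23, no preprocessor directives at all
    import std;

    int main() {
        std::println("modules say hello");
    }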

hoc

I definitely wouldn't have used "<<" in an "ad" for C++ :)

(I must say that I was happy to see/read that article, though)

DonHopkins

Generalizing Overloading for C++2000

Bjarne Stroustrup, AT&T Labs, Florham Park, NJ, USA

Abstract

This paper outlines the proposal for generalizing the overloading rules for Standard C++ that is expected to become part of the next revision of the standard. The focus is on general ideas rather than technical details (which can be found in AT&T Labs Technical Report no. 42, April 1, 1998).

https://www.stroustrup.com/whitespace98.pdf

jjmarr

Modules sound cool for compile time, but do they prevent duplicative template instantiations? Because that's the real performance killer in my experience.

Maxatar

Modules don't treat templates any differently than non-modules so no, they don't prevent duplicate template instantiations.

senkora

The best way that I know of to do this is the "manual export templates" idea discussed here: http://warp.povusers.org/programming/export_templates.html

(It's a great post in general. N.B. that it's also quite old and export templates have been removed from the standard for quite some time after compiler writers refused to implement them.)

TL;DR: Declare your templates in a header, implement them in a source file, and explicitly instantiate them inside that same source file for every type that you want to be able to use them with. You lose expressiveness but gain compilation speed because the template is guaranteed to be compiled exactly once for each instantiation.
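
A minimal sketch of that pattern (Matrix and scale are made-up names):

    // matrix.h -- declarations only; the template body is not visible to users.
    template <class T>
    class Matrix {
    public:
        void scale(T factor);   // defined in matrix.cpp
    };

    // matrix.cpp -- the only place the template body is compiled.
    #include "matrix.h"

    template <class T>
    void Matrix<T>::scale(T factor) { /* ... */ }

    // Explicit instantiations: the complete set of types users may instantiate.
    template class Matrix<float>;
    template class Matrix<double>;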

jcranmer

You can declare a template in a header file, and only provide its definition (and hence expansion) in a source file. See for example Firefox doing this for its string implementation here: https://searchfox.org/mozilla-central/source/xpcom/string/ns... (extern template declarations are at the end of the header file, and the actual template definitions are in https://searchfox.org/mozilla-central/source/xpcom/string/ns...).

Which is to say, "extern template" is a thing that exists, that works, and can be used to do what you want to do in many cases.

The "export template" feature was removed from the language because only one implementer (EDG) managed to implement them, and in the process discovered that a) this one feature was responsible for all of their schedule misses, b) the feature was far too annoying to actually implement, and c) when actually implemented, it didn't actually solve any of the problems. In short, when they were asked for advice on implementing export, all the engineers unanimously replied: "don't". (See https://www.open-std.org/jtc1/sc22/wg21/docs/papers/2003/n14... for more details).

senkora

> You lose expressiveness

Or, more correctly, the following happens:

1. You gain the ability to use the compilation unit's anonymous namespace instead of a detail namespace, so there is better encapsulation of implementation details. The post author stresses this as the actual benefit of export templates, rather than compile times.

2. You lose the ability to instantiate the template for arbitrary types, so this is probably a no-go for libraries.

3. Your template is guaranteed to be compiled exactly once for each explicit instantiation. (Which was never actually guaranteed for real export templates).

mempko

Bjarne Stroustrup (the creator of C++) is the best language designer. Many language designers will create a language, work on it for a couple years, and then go and make another language. Stroustrup on the other hand has been methodically working on C++ and each year the language becomes better.

mskcc

Prof. Bjarne's commitment to C++ is beyond comparison!

sixthDot

So now even HN is being polluted with AI.

zie1ony

Seeing badly formatted code snippets without color highlighting in an article called "21st Century C++" somehow resonates with my opinion of how hard C++ still is to write and to read after working with other languages.

AtlasBarfed

This honestly looks like C++ being feature-jury-rigged to the degree that it doesn't even look like what C++ is: a C-derived low-level language.

Everything is unobvious magic. Sure, you stick to a very restricted set of API usages and patterns, and all the magic allocation/deallocation happens out of sight.

But does that make it easier to debug? Better to code it?

This simply looks like C++ trying not to look like C++: like a completely different language, but one that was not built from the ground up to be that language, rather a bunch of shell games to make it look like another language as an illusion.

DidYaWipe

Yeah, I didn't have a problem keeping my shit straight in C++ in the '90s. The kitchen-sink approach since then hasn't been worth keeping up with. The fact that we're still dealing with header files means that the language stewards' priorities are not in line with practical concerns.