The earliest versions of the first C compiler known to exist
173 comments
·March 21, 2025
tanelpoder
The first publicly available version of Oracle Database (v2 released in 1979) was written in assembly for PDP-11. Then Oracle rewrote v3 in C (1983) for portability across platforms. The mainframes at the time didn't have C compilers, so instead of writing a mainframe-specific database product in a different language (COBOL?), they just wrote a C compiler for mainframes too.
chasil
UNIX was ported to the System/370 in 1980, but it ran on top of TSS, which I understand was an obscure product.
"Most of the design for implementing the UNIX system for System/370 was done in 1979, and coding was completed in 1980. The first production system, an IBM 3033AP, was installed at the Bell Laboratories facility at Indian Hill in early 1981."
https://web.archive.org/web/20240930232326/https://www.bell-...
jdougan
Interesting. Summer 84/85 (maybe 85/86) I used a port of PCC to System/360 (done, I believe, by Scott Kristjanson) on the University of British Columbia mainframes (Amdahls running MTS). I was working on mail software, so I had to deal with EBCDIC/ASCII issues, which was no fun.
I sometimes wonder if that compiler has survived anywhere.
chasil
z/OS 3.1 is certified for UNIX 95, if this list is correct:
https://www.opengroup.org/openbrand/register/index2.html
That would include a C compiler, but yours is probably on tape somewhere.
Linux has been on this list, courtesy of two Chinese companies.
skissane
> The first publicly available version of Oracle Database (v2 released in 1979) was written in assembly for PDP-11.
I wonder if anybody still has a copy of Oracle v2 or v3?
Oldest I've ever seen on abandonware sites is Oracle 5.1 for DOS
> The mainframes at the time didn't have C compilers
Here's a 1975 Bell Labs memo mentioning that C compilers at the time existed for three machines [0] – PDP-11 UNIX, Honeywell 6000 GCOS, and "OS/370" (which is a bit of a misnomer, I think it actually means OS/VS2 – it mentions TSO on page 15, which rules out OS/VS1)
That said, I totally believe Oracle didn't know about the Bell Labs C compiler, and Bell Labs probably wouldn't share it if they did, and who knows if it had been kept up to date with newer versions of C, etc...
SAS paid Lattice to port their C compiler to MVS and CMS circa 1983/1984, so probably around the same time Oracle was porting Oracle to IBM mainframes – because I take it they also didn't know about or couldn't get access to the Bell Labs compiler
Whereas, Eric Schmidt succeeded in getting Bell Labs to hand over their mainframe C compiler, which was used by the Princeton Unix port, which went on to evolve into Amdahl UTS. So definitely Princeton/Amdahl had a mainframe C compiler long before SAS/Lattice/Oracle did... but maybe they didn't know about it or have access to it either. And even though the original Bell Labs C compiler was for MVS (aka OS/VS2 Release 2–or its predecessor SVS aka OS/VS2 Release 1), its Amdahl descendant may have produced output for Unix only
I assume whatever C compiler AT&T's TSS-based Unix port (UNIX/370) used was also a descendant of the Bell Labs 370 C compiler. But again, it probably produced code only for Unix not for MVS, and probably wasn't available outside of AT&T either
[0] https://archive.org/details/ThePortableCLibrary_May75/page/n...
ggm
I very much doubt anyone from the time wants to talk about it, but there is substantial bad blood about Oracle and Ingres. I believe not all of this story is in the public domain, nor capable of being discussed without lawyers.
dboreham
Writing something that large in assembly is pretty crazy, even in 1979!
saghm
Was it actually that uncommon back then? My understanding is that there were other things (including Unix itself, since it predated C and was only rewritten in it later) written in assembly initially back in the 70s. Maybe Oracle is much larger compared to other things done this way than I realize, or maybe the veneration of Unix history has just been part of my awareness for too long, but for some reason hearing that this happened with Oracle doesn't hit as hard for me as it seems to for you. It's possible I've become so accustomed to something historically significant that I fail to be impressed by a similar feat, but I genuinely thought that assembly was just the language used for low-level stuff for a long time (not that I'm saying there weren't other systems languages besides C, but my recollection is having read that for a while some people were skeptical of the idea of using any high-level language in place of assembly for systems programming).
acchow
Keep in mind, Oracle was designed to run with 128KB of RAM (no swapping). So it was really tens of thousands of lines, not millions.
ChrisMarshallNY
This is my favorite function :): https://github.com/mortdeus/legacy-cc/blob/936e12cfc756773cb...
arp242
Gotta love the user-friendliness of these old Unix tools:
    if (argc<4) {
        error("Arg count");
        exit(1);
    }
Rendello
SQLite error messages are similarly spartan. I wrote a SQLite extension recently and didn't find it difficult to have detailed/dynamic error messages, so it may have just been a preference of the author.
Amlal
Ah, yes, was that because of a lack of inline assembly? I feel like these could be replaced by 'nop' operations.
johnisgood
What is the point of it?
aap_
It's an awkward way to reserve memory. The important detail here is that both compiler phases do this, and the way the programs are linked guarantees that the reserved region has the same address in both phases. Therefore an expression tree involving pointers can be passed to the second phase very succinctly. Not pretty, no, but hardware limitations force you to come up with strange solutions sometimes.
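Roughly, the idea is something like this (a hypothetical sketch, not the actual compiler source, and assuming a machine where code and data share one address space, as on the PDP-11):

    /* waste() is never called; its compiled body just occupies bytes.
       Because both phases link it at the same address, a pointer into
       this region means the same thing in phase 1 and phase 2. */
    waste() {
        waste(); waste(); waste(); waste();
        waste(); waste(); waste(); waste();
    }
    /* the scratch region then starts at the address of waste itself */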
colejohnson66
Here's the actual code that references the 'ospace' from before 'waste': https://github.com/mortdeus/legacy-cc/blob/936e12cfc756773cb...
johnisgood
Thank you! Is it relevant today at all, or is there a use case for it today?
fxtentacle
It's an obscure way to statically allocate memory for the ospace pointer.
account42
What's the advantage over an array, though, which would allow you to better control the size without making assumptions about code generation?
edit: http://cm.bell-labs.co/who/dmr/primevalC.html (linked from another comment) has the answer:
> A second, less noticeable, but astonishing peculiarity is the space allocation: temporary storage is allocated that deliberately overwrites the beginning of the program, smashing its initialization code to save space. The two compilers differ in the details in how they cope with this. In the earlier one, the start is found by naming a function; in the later, the start is simply taken to be 0. This indicates that the first compiler was written before we had a machine with memory mapping, so the origin of the program was not at location 0, whereas by the time of the second, we had a PDP-11 that did provide mapping. (See the Unix History paper). In one of the files (prestruct-c/c10.c) the kludgery is especially evident.
So I guess it has to be a function in order to be placed in front of main() so the buffer can overflow into the no longer needed code at the start of it.
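With a linker and memory protection, the modern, well-defined equivalent would just be a static buffer, e.g. (size made up for illustration):

    /* modern equivalent: reserve the scratch space explicitly */
    enum { OSSIZ = 250 };          /* size chosen for illustration */
    static int ospace[OSSIZ];      /* expression-tree scratch area  */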
rasjani
Without actually knowing, I'd guess that would generate bytecodes that could be modified later by patching the resulting binary?
I remember a few buddies using a similar pattern in ASM that just added n NOPs into the code to allow patching, thus eliminating a possible recompilation.
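Something like this, with GCC-style inline assembly (assuming an x86 or ARM target where "nop" is a valid mnemonic):

    /* a run of NOPs acting as a patch pad: a later binary patch can
       overwrite these bytes with real instructions, no recompile needed */
    void patch_pad(void)
    {
        __asm__ volatile ("nop; nop; nop; nop; nop; nop; nop; nop");
    }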
ChrisMarshallNY
I suspect that’s it.
There was a lot of self-modification going on in those days. Old machine language stuff had very limited resources, so we often modified code, or reused code space.
agumonkey
Warm up the stack? (No idea, to be honest.)
tanelpoder
The C alternative for the hardware "halt and catch fire" instruction?
kps
The C alternative for the Fortran COMMON block.
keyle
Aside: I was playing with Think C [2] yesterday and System 6.0.8 (emulated with Mini vMac [1]).
Boy, it took a lot of code to get a window behaving back in the day... And this is much more modern C; it's actually ANSI C, but the API is thick.
I did really enjoy the UX of System 6 and its terse look, if you can call it that [3].
[1] https://www.gryphel.com/c/minivmac/start.html
[2] https://archive.org/details/think_c_5
[3] https://miro.medium.com/v2/resize:fit:1024/format:webp/0*S57...
brucehoult
It's much less of your own code if you use TCL (THINK Class Library), which shipped with THINK C 4.0 (and THINK Pascal) in mid 1989.
Your System 6.0.8 is from April 1991, so TCL was well established by then and the C/C++ version in THINK C 5 even used proper C++ features instead of the hand-rolled "OOP in C" (nested structs with function pointers) used by TCL in THINK C 4.
I used TCL for smaller projects, mostly with THINK Pascal which was a bit more natural using Object Pascal, and helped other people use it and transition their own programs that previously used the Toolbox directly, but my more serious programs used MacApp which was released for Object Pascal in 1985, and for C++ in 1991.
keyle
Thanks for this. I was using THINK C 3.x last night, unaware that there was a 5.0. I figured it out as I typed and googled this morning. I will have to revisit 5.0 and pick up a digitised book.
bluetomcat
Interesting usage of "extern" and "auto". Quite different from contemporary C:
    tree() {
        extern symbol, block, csym[], ctyp, isn,
            peeksym, opdope[], build, error, cp[], cmst[],
            space, ospace, cval, ossiz, exit, errflush, cmsiz;
        auto op[], opst[20], pp[], prst[20], andflg, o, p, ps, os;
        ...

Looks like "extern" is used to bring global symbols into function scope. Everything looks to be "int" by default. Some array declarations are specifying a size, others are not. Are the "sizeless" arrays meant to be used as pointers only?
fsckboy
>Looks like "extern" is used to bring global symbols into function scope.
a better way to think of extern is, "this symbol is not declared/defined/allocated here, it is declared/defined/allocated someplace else"
"this is its type so your code can reference it properly, and the linker will match up your references with the declared/defined/allocated storage later"
(i'm using reference in the generic english sense, not pointer or anything. it's "that which can give you not only an r-value but an l-value")
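In modern C the same split looks like this (a minimal two-file sketch, using one of the names from the quoted code):

    /* defs.c - defines (allocates) the variable */
    int isn = 0;

    /* tree.c - extern declares it; the linker matches the reference
       to the storage allocated in defs.c */
    extern int isn;

    int next_label(void) { return ++isn; }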
Joker_vD
Yes, pretty much. To be fair, C at this point was basically BCPL with slightly different syntax (and better char/string support). The introduction of structs (and then longs) changed it forever.
kragen
BCPL had a lot of features C didn't have at this point and still doesn't. You mean B.
Joker_vD
Could you elaborate on those features? Off the top of my head, those are:
- nested functions — those were always of dubious usefulness compared to the implementation difficulties they require;
- labels are actual constants, so computed GOTO is available — that's definitely a feature standard C still doesn't have;
- manifest constants — this one is Ritchie's most baffling omission from the language;
- multiple assignment — it's not actually parallel, so merely a syntactic nicety (with a footgun loaded);
- valof-resultis — while very nice, it's also merely a syntactic nicety: "lvalue := valof (... resultis expr; ...)" is the same as "{... lvalue = expr; goto after; ... } after: ;".
What else is there? Pointless distinction between the declaration syntax of functions and procedures?
psjs
What features did BCPL have at that point that C didn’t have (and stil does not)?
xenadu02
"auto" used to mean automatic memory management because if you are coming from assembly or even some other older higher-level languages you can't just declare a local variable and use it as you please. You must declare somewhere to store it and manage its lifetime (even if that means everything is global).
C and its contemporaries introduced automatic or in modern terms local or stack allocated values, often with lexically-scoped lifetimes. extern meaning something outside this file declares the storage for it and register meaning the compiler should keep the value in a register.
However, auto has always been the default and thus redundant; almost no one's style involved explicitly specifying it, so it was little used in the wild. So the C23 committee adopted auto to mean the same as in C++: automatically infer the type of the declaration.
You can see some of B's legacy in the design of C. Making everything int by default harkens back to B's lack of types, because everything was a machine word you could interpret however you wanted.
The same legacy shows in original C's function declarations, which don't really make sense today. The prototype only declares the name, and the function definition then gives (between the closing paren and the opening brace) the list of parameters and their types. There was no attempt whatsoever to have the compiler verify you passed the correct number or types of parameters.
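For anyone who hasn't seen it, that old (K&R) style looked like this; it was only removed from the language in C23 (a sketch, not code from the original compiler):

    /* K&R-style definition: parameter types go between the closing
       paren and the opening brace; callers were never checked */
    int add(a, b)
    int a;
    int b;
    {
        return a + b;
    }

    /* old compilers happily accepted a call like add(1, 2, 3) */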
HarHarVeryFunny
Declaring a variable or function as extern(al) just tells the compiler to assume that it is defined "externally", i.e. in another source file. The compiler will generate references to the named variable/function, and the linker will substitute the actual address of the variable/function when linking all the object files together.
Modern C technically still lets you put extern declarations inside a function, though you must now spell out the types; it's considered bad practice because it makes the code less readable. You can of course still put them at global scope (e.g. at the top of the source file), but it's better to put them into a header file, with your code organized into modules of paired .h definition and .c implementation files.
netbsdusers
You can do the same with a modern C compiler in its pre-C99 compatibility mode - the extern and auto mean the same, and int is still the default type.
tialaramex
In C23, auto doesn't have a default type, if you write auto without a type then you get the C++ style "type deduction" instead. This is part of the trend (regretted by some WG14 members) of WG14 increasingly serving as a way to fix the core of C++ by instead mutating the C language it's ostensibly based on.
You can think of deduction as crap type inference.
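A quick illustration of the C23 behaviour (assuming a C23 compiler):

    void demo(void)
    {
        auto x = 1.5;   /* C23: x is deduced as double */
        auto n = 42u;   /* n is unsigned int           */
        /* pre-C23, "auto x" with no type would have meant int */
        (void)x; (void)n;
    }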
pjmlp
Design by committee, the outcome is usually not what the people on the trenches would like to get.
int_19h
All of these things come directly from B:
https://www.nokia.com/bell-labs/about/dennis-m-ritchie/bintr...
dfawcus
As to "sizeless" arrays - yes.
Have a look at the early history of C document on DMR's site, it mentions that the initial syntax for pointers was that form.
ricardo81
Reminds me of the humility every programmer should have, basically we're standing on the shoulders of giants and abstraction for the most part. 80+ years of computer science.
Cool kids may talk about memory safety but ultimately someone had to take care of it, either in their code or abstracted out of it.
pjmlp
Memory safety predates C by a decade, in languages like JOVIAL (1958), ESPOL/NEWP (1961) and PL/I (1964), and it carried on in the same decade as C outside Bell Labs: PL/S (1970), PL.8 (1970), Mesa (1976), Modula-2 (1978).
If anything the cool kids are rediscovering what we lost in systems programming safety due to the wide adoption of C, and its influence in the industry, because the cool kids from 1980's decided memory safety wasn't something worth caring about.
"A consequence of this principle is that every occurrence of every subscript of every subscripted variable was on every occasion checked at run time against both the upper and the lower declared bounds of the array. Many years later we asked our customers whether they wished us to provide an option to switch off these checks in the interests of efficiency on production runs. Unanimously, they urged us not to--they already knew how frequently subscript errors occur on production runs where failure to detect them could be disastrous. I note with fear and horror that even in 1980 language designers and users have not learned this lesson. In any respectable branch of engineering, failure to observe such elementary precautions would have long been against the law."
-- C.A.R Hoare's "The 1980 ACM Turing Award Lecture"
Guess what programming language he is referring to by "1980 language designers and users have not learned this lesson".
estebank
The "cool kids talking about memory safety" are indeed standing on the shoulders of giants, to allow for others to stand even taller.
wang_li
Big non sequitur, but your comment triggered a peeve of mine: I find it ironic when people talk like oldsters can't understand technology.
worik
> ...people talk like oldsters can't understand technology
IMO it is young people that have trouble understanding.
The same mistakes are made over and over; lessons learned long ago are ignored in the present.
It's easier to write than to read, easier to talk than to listen, and easier to build new than to extend the old.
bigstrat2003
This is the way of young people in every domain, not just technology. Much like teenagers think they're the first ones ever to have sex, young people tend to think they are the first ones to notice "hey, this status quo really sucks" and try to solve it.
This can be a strength, to be fair - the human mind really does tend to get stuck in a rut based on familiarity, and someone new to the domain can spot solutions that others haven't because of that. But more often, it turns into futile attempts to solve problems while forgetting the lessons of the past.
phito
Understanding one level of abstraction doesn't mean you understand the levels of abstraction built on top of it. And vice versa.
ricardo81
Your comment sounds like a riddle. I've programmed for 25 years but appreciate there's a lot more going on than what I know.
wang_li
Upon my own rereading, it is unclear. My point is that the languages most of us use and the fundamental technologies in the OSes we use were designed/invented by people who are in their 80s now; many of the Linux core team are 50-60.
90s_dev
The thing I always loved about C was its simplicity, but in practice it's actually very complex with tons of nuance. Are there any low level languages like C that actually are simple, through and through? I looked into Zig and it seems to approach that simplicity, but I have reservations that I can't quite put my finger on...
steveklabnik
The reality is, the only languages that are truly simple are Turing tarpits, like Brainfuck.
Reality is not simple. Every language that’s used for real work has to deal with reality. It’s about how the language helps you manage complexity, not how complex the language is.
Maybe Forth gets a pass, but there's a good reason why it's effectively used only in very limited circumstances.
bluetomcat
The perceived complexity from a semantic standpoint comes from the weakly-typed nature of the language. When the operands of an expression have different types, implicit promotions and conversions take place. This can be avoided by using the appropriate types in the first place. Modern compilers have warning flags that can spot such dodgy conversions.
The rest of the complexity stems from the language being a thin layer over a von Neumann abstract machine. You can mess up your memory freely, and the language doesn’t guarantee anything.
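A classic example of the kind of implicit conversion that warning flags like -Wsign-compare exist to catch:

    #include <stdio.h>

    int main(void)
    {
        unsigned int u = 1;
        int i = -1;
        /* the usual arithmetic conversions turn i into a huge
           unsigned value, so the "obvious" comparison is false */
        if (i < u)
            printf("-1 < 1u, as you'd expect\n");
        else
            printf("surprise: -1 is NOT less than 1u here\n"); /* printed */
        return 0;
    }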
grandempire
C is simple.
Representing computation as words of a fixed bit length, in random access memory, is not (see The Art of Computer Programming). To the extent that other languages are simpler, it's because they create simpler memory models.
mort96
What about C is simple? Its syntax is certainly not simple, it's hard to grok and hard to implement parsers for, and parsing depends on semantic analysis. Its macro system is certainly not simple; implementing a C preprocessor is a huge job in itself, it's much more complex than what appears to be necessary for a macro system or even general text processor. Its semantics are not simple, with complex aliasing rules which just exist as a hacky trade-off between programming flexibility and optimizer implementer freedom.
C forces programs to be simple, because C doesn't offer ways to build powerful abstractions. And as an occasional C programmer, I enjoy that about it. But I don't think it's simple, certainly not from an implementer's perspective.
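The standard example of the parsing problem is the typedef ambiguity; the statement below cannot be parsed without knowing what A names:

    typedef int A;

    void f(void)
    {
        A * B;   /* with the typedef in scope: declares B as int* */
    }
    /* without the typedef, "A * B;" would instead parse as a
       multiplication of two variables whose result is discarded */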
uecker
First (as in my other comment), the idea that C parsing depends on semantic analysis is wrong (and yes, I wrote C parsers). There are issues which may make implementing C parsers hard if you are not aware of them, but those issues hardly compare to the complexities of other languages, and can easily be dealt with if you know about them. Many people have implemented C parsers.
The idea that C does not offer ways to build powerful abstractions is also wrong in my opinion. It basically allows the same abstractions as other languages, but it does not provide as much syntactic sugar. Whether this syntactic sugar really helps or whether it obscures semantics is up for debate. In my opinion (having programmed a lot more C++ in the past), it does not, and C is better for building complex applications than C++. I build very complex applications in C myself, and some of the most successful software projects were built using C. I find it easier to understand complex applications written in C than in other languages, and I also find it easier to refactor C code which is messed up than to untangle the mess you can create with other languages. I admit that some people might find the syntactic sugar helpful for building abstractions. In C you need to know how to build abstractions yourself, based on training or experience.
I see a lot of negativity towards C in recent years, which goes against clear evidence, e.g. "you can not build abstractions" or "all C programs segfault all the time", when in reality most of the programs I rely on on a daily basis, and which in my experience never crash, are written in C.
pjc50
Parsing isn't too bad compared to, say, Perl.
The preprocessor is a classic example of simplicity in the wrong direction: it's simple to implement, and pretty simple to describe, but when actually using it you have to deal with complexity like multiple evaluation of arguments.
The semantics are a disaster ("undefined behavior").
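The multiple-evaluation trap mentioned above looks like this (a textbook example, not from any particular codebase):

    #define MAX(a, b) ((a) > (b) ? (a) : (b))

    int f(void)
    {
        int i = 5;
        int m = MAX(i++, 3);   /* expands so i++ runs twice: m == 6, i == 7 */
        return m + i;
    }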
grandempire
Each of these elements is even worse in every other language I can think of. What language do you think is simple in comparison?
LPisGood
It’s not really clear to me how you could have a simple low level language without tons of nuance. Something like Go is certainly simple without tons of nuance, but it’s not low level, and I think extending it to be low level might add a lot of nuance.
bewo001
Forth would come to mind; some people have built surprising stuff with it, though I find it too low-level.
fads_go
Lisp is built from a few simple axioms. Would that make it simple?
spauldo
Lisp could be simple... but there's a lot of reasons it isn't.
It uses a different memory model than current hardware, which is optimized for C. While I don't know what goes on under SBCL's hood, the simpler Lisps I'm familiar with usually have a chunk of space for cons cells and a chunk of "vector" space kinda like a heap.
Lisp follows s-expression rules... except when it doesn't. Special forms, macros, and fexprs can basically do anything, and it's up to the programmer to know when sexpr syntax applies and when it doesn't.
Lisp offers simple primitives, but often also very complex functionality as part of the language. Just look at all the crazy stuff that's available in the COMMON-LISP package, for instance. This isn't really all that different than most high level languages, but no one would consider those "simple" either.
Lisp has a habit of using "unusual" practices. Consider Scheme's continuations and use of recursion, for example. Some of those - like first-class functions - have worked their way into modern languages, but imagine how they would have seemed to a Pascal programmer in 1990.
Finally, Lisp's compiler is way out there. Being able to recompile individual functions during execution is just plain nuts (in a good way). But it's also the reason you have EVAL-WHEN.
All that said, I haven't investigated microcontroller Lisps. There may be one or more of those that would qualify as "simple."
kazinator
Mostly we have eval-when because of outdated defaults that are worth re-examining.
A Lisp compiler today should by default evaluate every top-level form that it compiles, unless the program opts out of it.
I made that decision in TXR Lisp and it's so much nicer that way.
There are fewer surprises and less need for boilerplate for compile time evaluation control. The most you usually have to do is tell the compiler not to run that form which starts your program: for instance (compile-only (main)). In a big program with many files that could well be the one and only piece of evaluation control for the file compiler.
The downside of evaluating everything is that these definitions sit in the compiler's environment. This pollution would have been a big deal when the entire machine is running a single Lisp image. Today I can spin up a process for the compiling. All those definitions that are not relevant to the compile job go away when that exits. My compiler uses a fraction of the memory of something like GCC, so I don't have to worry that these definitions are taking up space during compilation; i.e. that things which could be written to the object file and then discarded from memory are not being discarded.
Note how when eval-when is used, it's the club sandwich 99% of the time: all three toppings, :compile-toplevel, :load-toplevel, :execute are present. The ergonomics are not very good. There are situations in which it would make sense to only use some of these but they rarely come up.
dwattttt
So are entire branches of mathematics, and I feel safe in saying they are not "simple"
ioma8
I would say Rust. When you learn the basics, Rust is very simple and will point out any errors you have, so you get basically no runtime errors. Also, the type system is extremely clean, making the code very readable.
But C itself is also a very simple language. I do not mean C++, but pure C. I would probably start with this. Yes, you will crash at runtime errors, but besides that it's a very, very simple language, which will give you a good understanding of memory allocation, pointers, etc.
ForOldHack
Got through C and K&R with no runtime errors, on four platforms, but the first platform... Someone asked the teacher why a struct would not work on Lattice C. The instructor looked at the code, sat down at the student's computer, typed in a small program, compiled it, and calmly put the disks in the box with the manual and threw it in the garbage. "We will have a new compiler next week." We switched to Manx C, which is what we had on the Amiga. Structs worked on MS C, which I thought was the lettuce compiler. (Apparently a different fork of the portable C compiler, but they admitted years later that it was still big-endian.)
Best programming joke: the teacher said when your code becomes "recalcitrant"... we had no idea what he meant. This was on the bottom floor of the library, so on break we went upstairs and used the dictionary. Recalcitrant means not obeying authority. We laughed out loud, and then went silent. Oops.
The instructor was a commentator on the cryptic-C challenges, and would often say... "That will not do what you think it will do" and then go on and explain why. Wow. We learned a lot about the pre-processor, and more about how to write clean and useful code.
icedchai
Lattice C (on the Amiga) was my first C compiler! Do you remember what struct issue you ran into? This was a pretty late version... like 5.x.
int_19h
Modula-2 is a language operating on the same level (direct memory addressing, no GC etc) but with saner syntax and semantics.
It's still a tad more complicated than it needs to be - e.g. you could drop non-0-based arrays, and perhaps sets and even enums.
FeistySkink
Missed opportunity not calling it LegaC.
smackay
1972 is the answer to the question on the lips of everybody too busy to look at the source files.
jeff_carr
The first 4 commits in the Go repo are:

commit d82b11e4a46307f1f1415024f33263e819c222b8
Author: Brian Kernighan <bwk@research.att.com>
Date: Fri Apr 1 02:03:04 1988 -0500

    last-minute fix: convert to ANSI C
    R=dmr
    DELTA=3 (2 added, 0 deleted, 1 changed)

:100644 100644 8626b30633 a689d3644e M src/pkg/debug/macho/testdata/hello.c

commit 0744ac969119db8a0ad3253951d375eb77cfce9e
Author: Brian Kernighan <research!bwk>
Date: Fri Apr 1 02:02:04 1988 -0500

    convert to Draft-Proposed ANSI C
    R=dmr
    DELTA=5 (2 added, 0 deleted, 3 changed)

:100644 100644 2264d04fbe 8626b30633 M src/pkg/debug/macho/testdata/hello.c

commit 0bb0b61d6a85b2a1a33dcbc418089656f2754d32
Author: Brian Kernighan <bwk>
Date: Sun Jan 20 01:02:03 1974 -0400

    convert to C
    R=dmr
    DELTA=6 (0 added, 3 deleted, 3 changed)

:100644 000000 05c4140424 0000000000 D src/pkg/debug/macho/testdata/hello.b
:000000 100644 0000000000 2264d04fbe A src/pkg/debug/macho/testdata/hello.c

commit 7d7c6a97f815e9279d08cfaea7d5efb5e90695a8
Author: Brian Kernighan <bwk>
Date: Tue Jul 18 19:05:45 1972 -0500

    hello, world
    R=ken
    DELTA=7 (7 added, 0 deleted, 0 changed)

:000000 100644 0000000000 05c4140424 A src/pkg/debug/macho/testdata/hello.b

Joker_vD
Funnily enough, it is emphatically not a single-pass compiler.
dbrower
I don’t think anybody thinks or thought it was.
int_19h
I thought it would be, given that C is designed in such a way that a single pass ought to be sufficient. Single-pass compilers were not uncommon in that era.
Joker_vD
Was it really designed this way? I keep hearing this claim but I don't think Ritchie himself actually confirmed that?
Also, notice how the functions call each other from wherever, even from different files, without any forward declarations, and it simply works - which, as I have been repeatedly told, is not something a single-pass compiler can implement :)
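What made that work in one pass is the old implicit-declaration rule: a call to an as-yet-unseen function implicitly declared it as returning int, and the linker resolved the address later. A sketch in the old dialect (modern compilers reject this):

    main()          /* implicit int, old style */
    {
        return f(); /* f has not been declared: implicitly "int f()" */
    }

    f()
    {
        return 42;  /* the linker connects the call site to this */
    }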
deweywsu
Am I interpreting this repo correctly? The first C compiler was written in...C?
schindlabua
It would have been bootstrapped in assembly (or B/BCPL?) and then once you can compile enough C to write a C compiler you rewrite your compiler in C.
I remember a Computerphile video where prof. Brailsford said something along the lines of "nobody knew who wrote the first C compiler, everybody just kinda had it and passed it around the office" which I think is funny. There's some sort of analogy to life and things emerging from the primordial soup there, if you squint hard enough.
froh
Yes. The question you're asking is: "how was this bootstrapped?"
The page that's referenced from GitHub doesn't describe that:
http://cm.bell-labs.co/who/dmr/primevalC.html
However, there probably was a running C compiler (written in assembly), plus an assembler and a linker, hand-bootstrapped from machine code: first machine code, then assembler and linker, then B, NB, and then C...
We can't tell, but that would make sense...
xenadu02
The first B compiler was written in BCPL on the GE 635 mainframe. Thompson wrote a B compiler in BCPL which they used to cross-compile for PDP-7. Then Thompson rewrote B in B, using the BCPL compiler to bootstrap. AFAIK this is the only clean "bootstrap" step involved in the birth of C (BCPL -> B -> self-compiled B)
Then they tweaked the compiler and called it NB (New B), then eventually tweaked it enough they decided to call it C.
The compiler continuously evolved by compiling new versions of itself through the B -> New B -> C transition. There was no clean cutoff to say "ah this was the first C compiler written in New B".
You can see evidence of this in the "pre-struct" version of the compiler after Ritchie had added structure support but before the compiler itself actually used structs. They compiled that version of the compiler then modified the compiler source to use structs, thus all older versions of the compiler could no longer compile the compiler: https://web.archive.org/web/20140708222735/http://thechangel...
Primeval C: https://web.archive.org/web/20140910102704/http://cm.bell-la...
A modern bootstrapping compiler usually keeps around one or more "simplified" versions of the compiler's source. The simplest one either starts with C or assembly. Phase 0 is compiled or assembled then is used to compile Phase 1, which is used to compile Phase 2.
(Technically if you parsed through all the backup tapes and restored the right versions of old compilers and compiler source you'd have the bootstrap chain for C but no one bothered to do that until decades later).
robertkoss
As someone who has no touchpoints with lower-level languages at all, can you explain to me why those files are called c01, c02, etc.?
diginova
Also read how a compiler can be written in the same language - https://en.wikipedia.org/wiki/Bootstrapping_%28compilers%29