Urban legend: I think there is a world market for maybe five computers
58 comments
January 22, 2025 · WarOnPrivacy
hinkley
Out of the 20 potential customers they pitched to.
So they were counting on a 25% success rate and got 90%.
thmsths
I wonder if it caused any issues. Getting 3 times the number of orders can be great, but it can also be a Pyrrhic victory, depending on your ability to deliver.
t0mas88
With an order size like this there is no public pricing information. So if you can't deliver fast enough, you would adjust prices or specifically charge more for the first few delivery slots.
Compare it to ordering very high-priced items with long lead times, like airliners: you pay for a specific delivery slot, not just the item at a random moment. And you can buy options for more deliveries in a specific timeframe, which influences the price of your order.
SoftTalker
At the time (if TFA is correct) the 701 had already existed for a year. So it was only a question of building more of them, not that they had sold 5 vs 18 of something that didn't exist yet. But, they were also most likely on the hook for installing and running them -- at that time a computer like that would have been leased with installation services and on-site operations staff included.
bityard
From a salescritter's perspective, that is frankly not their problem.
moralestapia
No, it did not cause any issues.
tbrownaw
It may be apocryphal, but it's not all that wrong.
Those "about five" computers even have names: AWS, Azure, GCP, ...
ghaff
Sun’s CTO repurposed the quote to make exactly that point in the 2000s, likely before any of those existed. It’s very much an oversimplification but if you squint it’s not totally wrong either.
lysace
Sounds like something Jonathan Schwartz (the ponytailed COO @ Sun at the time, I believe) could have said. Did you mean him?
His blog was strangely addictive at the time. Great writer.
ghaff
No, I'm sure it was Greg though I don't think you can get to the Sun blogs any longer. But that's not to say that Jonathan didn't reuse the line himself. (I was an IT industry analyst at the time--and am again.)
Ah, but here's a reference to it: https://www.cnet.com/tech/tech-industry/the-world-needs-only...
I'm guessing it was when Sun started talking up Sun Grid, though that part I'm not sure of; the timeframe of Stephen's article pretty much matches.
tw04
I believe his line was actually: the network is the computer.
And he was right, he just didn’t anticipate greedy US ISPs would set progress back 2 decades.
ghaff
Oh, please. Pray tell, inform us about how ISPs held back progress for 2 decades. Good broadband access could perhaps have come earlier and cheaper but it basically came soon enough once web-based services were available. I'm not going to argue that US ISPs are universally great but saying that they held back progress by "2 decades" is pretty much ignorant. Especially given that Sun was presumably mostly talking about the context of business computing at the time.
ecshafer
I haven't thought about this much before, but I think it must be a myth. Going by the https://en.wikipedia.org/wiki/History_of_IBM wiki on the history of IBM, there were "computing machines" in the 30s, referring to their calculators and tabulating machines. IBM was already selling more than 5 of these devices, so if the 1943 date were true, the quote makes no sense. If it refers to a single machine having a market of 5 units, that might be true.
potato3732842
Yeah, there were so many specialist analog computing machines out there in the 1940s and earlier that the pop culture interpretation of the quote as being "for all digital computers of all types the world over" just doesn't pass the sniff test.
PeterStuer
A little bit more cloud consolidation and you could argue we're nearly there.
thayne
> Some people question how much of the internet is a place that documents history, and how much of the internet is a place that writes and recreates history.
So, basically the same as things written before the internet existed. It's not like people didn't write down myths and legends on paper, or stone tablets for that matter.
PaulDavisThe1st
> It's not like people didn't write down myths and legends on paper, or stone tablets for that matter.
Across broad swathes of the planet (i.e. all of the Americas), they did not (until very very recently in the overall scheme of human history)
samatman
*most of the Americas.
Analemma_
People really love apocryphal quotes that portray famous or disliked figures as morons. Bill Gates never said "640k ought to be enough for anybody" either, yet that circulates to this day.
mywittyname
And they never pass the smell test.
Clearly the chairman of IBM in the 50s didn't believe they would only ever sell 5 computers. What would be the point in investing all of those resources into building anything with that sort of limit?
It's obvious to anyone who understands business and takes a few seconds to consider the quote that he's likely talking about a specific product that is currently priced out of the market. The 5 in that statement is probably sourced from looking at their current clients and seeing A) who could afford such a machine and B) for which of those clients does buying the 701 make clear economic sense? The 20 companies they pitched to probably fell into category A, but they just miscalculated how many of those fell into category B.
The goal was to drive down costs through economies of scale.
ciberado
On the other hand, sometimes se non è vero, è ben trovato ("even if it's not true, it's well conceived"). We need stories. Models. The Gandhi we know was not the real one, the same for Churchill or any other person, and the same thing happens with some villains. My personal point of view is that apocryphal quotes are just an extension of that mechanism, and can be useful in the construction of our thoughts.
shermantanktop
Sure, but the formula is so clunky. Just take:
A) a person famous for their brilliance or other quality
B) a humdrum everyday failing experienced by regular people, such as hubris, poor ability to predict the future, problems in school, difficulty in relationships, etc.
Mix them together and you get "Einstein flunked math in primary school" and Freud saying "who knows what women want" and other stuff.
I'm all for stories, but these aren't very good ones.
bombcar
Even if Gates never said it, someone involved in the design of DOS decided that 640k would be the demarcation between normal memory and "reserved".
So somebody, somewhere, decided that 640k was "good enough" vs 700k or whatever.
layer8
It was decided by IBM for the IBM PC. The 8088 CPU had 1 MB of addressable memory, and some of it had to be reserved for the BIOS ROM, video hardware, and other expansion cards. So the exact limit was a trade-off between application RAM and hardware expansions. You could also phrase it as “384 KB is enough for BIOS and expansion cards”.
Moreover, 640K is a natural division point in hexadecimal notation. Segments in the 8088 memory model were 64K, so the ten 64K blocks starting at addresses 00000–90000 went to RAM, and the six starting at A0000–F0000 went to ROM and hardware.
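Roughly, in C terms (a minimal sketch of 8088 real-mode addressing I'm adding for illustration; the constants are just the well-known memory map):

    /* physical = segment * 16 + offset, a 20-bit (1 MB) space;
       the 640K line is simply address A0000 */
    #include <stdio.h>
    #include <stdint.h>

    static uint32_t phys(uint16_t seg, uint16_t off) {
        return ((uint32_t)seg << 4) + (uint32_t)off;
    }

    int main(void) {
        printf("%05X\n", (unsigned)phys(0x9000, 0xFFFF)); /* 9FFFF: last byte of conventional RAM */
        printf("%05X\n", (unsigned)phys(0xA000, 0x0000)); /* A0000: 640K, start of video memory */
        printf("%05X\n", (unsigned)phys(0xF000, 0x0000)); /* F0000: BIOS ROM */
        return 0;
    }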
ghaff
There were so many gymnastics to get a bit more memory to work with. Multiple autoexec.bat and config.sys files for many games and the like.
wrs
This happens all the time. Early MacOS put flags in the upper byte of heap pointers, because somebody thought "16MB is enough for anybody". Physical Address Extension had to be added to 32-bit Intel because 4GB turned out to not be "enough for anybody". And today's "64-bit" processors have 48-bit virtual address spaces (with often even smaller physical ones), but we'll see…
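The MacOS trick looks roughly like this (a minimal sketch of my own with a hypothetical flag bit, not Apple's actual Memory Manager code):

    /* With only 24 address bits in use (16 MB), the top byte of a
       32-bit pointer is "free" for flags, until real addresses grow
       past 16 MB and masking them off corrupts the pointer. */
    #include <stdint.h>
    #include <assert.h>

    #define ADDR_MASK   0x00FFFFFFu  /* low 24 bits: the actual address */
    #define FLAG_LOCKED 0x80000000u  /* hypothetical flag in the top byte */

    static uint32_t tag(uint32_t addr, uint32_t flags) { return (addr & ADDR_MASK) | flags; }
    static uint32_t untag(uint32_t tagged)             { return tagged & ADDR_MASK; }

    int main(void) {
        uint32_t p = tag(0x00123456u, FLAG_LOCKED);
        assert(untag(p) == 0x00123456u);  /* fine below 16 MB */
        assert(p & FLAG_LOCKED);          /* flag travels with the pointer */
        return 0;
    }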
bitwize
More like someone at IBM. The memory map put BIOS, video, and PCjr cartridge memory into the upper memory area. Other non-IBM-compatible x86 systems of the era could relax this restriction. The Tandy 2000 loaded its BIOS from disk (instead of having it in ROM) and ran MS-DOS (indeed, its BIOS API was IBM-compatible even though its hardware was not), and could access up to 768K of user memory flat out (896K with aftermarket expansions). This briefly gave it an advantage handling large spreadsheets and the like.
nonrandomstring
Having a single person to identify with an idea via a quote or famous formula is a very human kind of shorthand we love. People still say "Edison invented the light bulb", even though we know that's not strictly true at all.
People love revisionism in general. Equally, we like it when doubt is cast on revered icons to take them down a peg, or when grand villains are found to be not as wicked as we once thought. We love treasured theories being overturned. We live in an age on the last frontier of truth, where any controversial claim that throws mud on a cherished belief is popular, just for its iconoclasm.
Meanwhile I think smart people say "I don't care if it's actually true or not." Did Jesus or Plato or Cicero or Machiavelli actually say that? Who cares? It doesn't matter. If it's a good story that makes a clear point or illustrates an idea, then it's useful. So long as it's not something deeply offensive or unfair to attribute, whether Watson actually said it is irrelevant. It speaks to a more abstract truth about scale and underestimation. It's the kind of fallible thing he would have said... might have said... maybe should have said! As a famous businessman Watson would no doubt be proud to own that and have it associated with his name, even if slightly erroneously.
bitwize
Today, 4 GiB, which was enough to run an entire university back in the 90s, is what the most rinky-dink Wal-Mart special laptops come with and Windows barely runs at all on that much.
If Bill Gates had said "640k ought to be enough for everybody", at the time he could hardly be blamed for doing so, as single-user desktop machines of the day still typically shipped with 1/10 or 1/5 that much.
rqtwteye
A lot of Einstein quotes are misquoted or not by him at all. Same for Churchill.
Even the current news likes to pick quotes out of context. Trump says a lot of dumb things, but often when I hear the full context of something people are mad about, it turns out he didn't say that. I am sure the right wingers do the same.
hatthew
As Abraham Lincoln once said, "Don't believe everything you read on the internet."
fuzztester
The quote about Einstein, relativity and his driver is very cool.
This is one link for it; there are others:
https://www.electronicsweekly.com/blogs/mannerisms/yarns/836...
rqtwteye
Cute story. It even wouldn’t be out of character for him.
schoen
See also https://fakebuddhaquotes.com/ ("I Can't Believe It's Not Buddha!"). A huge genre!
HeyLaughingBoy
I can't imagine any salesperson making a statement like that!
bluGill
Sometimes they will - when they are predicting who might buy and thus how much they need to make to hit their numbers. No salesman wants to have unmeetable sales goals.
Still rare though.
unyttigfjelltol
IBM confirmed they went into a sales cycle in 1953 expecting to sell five units of their first machine, the IBM 701 Electronic Data Processing Machine. We don't know precisely how or to whom this estimate of five units was conveyed beforehand, but the gist of the statement appears to be accurate.
metalman
The way things are going this might be an overestimate: what with the possibility of space-based, completely stable, billion-qubit QPUs beaming all our data around with lasers, 3 might do it.
bitwize
As I understand it... one of the reasons the Soviets fell behind in computer technology was that back in the 60s, while Soviet engineers had good designs that were state-of-the-art for the era, the communist economic planners estimated the requirements for computer manufacture to be one per university or government department, for a total of maybe a few thousand, while Western manufacturers were getting orders in the tens or hundreds of thousands... and they had to come up with new technologies to produce the machines faster and cheaper in order to keep up, let alone compete with other manufacturers. So the market in the West grew explosively, requiring concomitant growth in innovation, and that put the Soviets on the back foot, forcing them to smuggle in and reverse-engineer System/370s, PDPs, etc. in order to stay current.
logicalfails
Any good books or sources on this? I would be interested to read more
bschne
+1, wasn’t aware of this, curious to learn more
InvisibleUp
It also didn’t help matters that Stalin was greatly opposed to cybernetics, resulting in no research done on the topic until 1954, the year after he died. And even then, things didn’t really kick off until 1958.
vegabook
See: Intel / iPhone
amelius
"is", not "will only ever be"
TheCoelacanth
The worldwide part is also made up. He wasn't talking about the entire world market, just the companies they tried to sell to on one specific marketing campaign.
Leading contender for actual quote: