Why Bell Labs Worked
May 11, 2025
YouWhy
Bell Labs grew to be a dominant player in an age characterized by an oversupply of a manageable number of highly capable scientists, not all of whom had a chance of getting anything resembling funding.
Today we have a huge oversupply of scientists, but there are too many of them to allow judging for potential, and many are not actually capable of dramatic impact.
More generally, a standard critique of "reproducing a golden age" narratives is that the golden age existed within a vastly different ecosystem and indeed stopped working for systemic reasons, many of which still apply.
In particular, just blaming 'MBA management' does little to explain why MBAs appeared in the first place, why they were a preferable alternative to other types of large-scale management, and indeed how to avoid relapsing into it after a few years and personnel shifts.
Overall, I am afraid this post, while evocative, did not convince me of what makes 1517 specifically so different.
pydry
>In particular, just blaming 'MBA Management' does little to explain why MBAs appeared in the first place
Whatever the reason it is definitely not because they are effective managers.
I suspect it's more of a social class phenomenon - they speak the language of the ownership class and that engenders trust.
Gibbon1
My theory is that when women and lower-class men started working as bookkeepers and accountants in post-war America, a way was needed to keep the plum jobs reserved for the failsons of the privileged classes.
I could be wrong, but while 'business schools' existed before then, the MBA as an upper-class Ivy League thing dates exactly to that time.
piokoch
Plus there was a lot of "low-hanging fruit" from wartime that was yet to be "productized", as we say today.
Radars, computers (Enigma crackers), lasers, and many less visible inventions that had a great impact in, say, materials science - barrels had to be durable, planes lighter and faster, etc. - all of which allowed building fancier stuff. The whole nuclear industry and everything surrounding it.
Another factor: the Cold War. There was an incentive to spend money if only there was some chance of winning some advantage.
gsf_emergency
>there's too many of them to allow judging
Agree with this in particular as a good symptom of the "tectonic shifts". I usually blame the Baumol effect, i.e., the increasing difficulty of the inherently human task of keeping science education and science educators up to date, especially when faced with the returns on optimizing more "mechanical" processes (including the impressive short-term returns on "merely" improving bean-counting or, in Bell Labs'/IBM's later eras, shifting info-processing away from paper).
I doubt AI or VCs* have any significant role to play in reducing the friction in the college-to-self-selling-breakthrough pipeline, but they should certainly channel most of their efforts to improving the "ecosystem" first
TFA has the right ideas, such as:
>Make sure people talk to each other every day.
Which already happens here on HN! (Although it's mostly different daily sets of people but.. the same old sets of ideas?)
*Not if the main marketable usecase for college students is to game existing metrics. And I don't see no Edtech in the RFS either
scrlk
As an interesting counterpoint to the idea of "just hire smart people and give them a lab", Ralph Gomory, head of IBM Research (a peer of Bell Labs in its day) from 1970-86 said:
> There was a mistaken view that if you just put a lab somewhere, hired a lot of good people, somehow something magical would come out of it for the company, and I didn't believe it. That didn't work. Just doing science in isolation will not in the end, work. [...] It wasn't a good idea just to work on radical things. You can't win on breakthroughs - they're too rare. It just took me years to develop this simple thought: we're always going to work on the in-place technology and make it better, and on the breakthrough technology. [0]
bluGill
Every breakthrough needs many 'man years' of effort to bring to market. Research is good, but for every researcher we need several thousand people doing all the hard work of getting the useful things to market in volume.
leoc
Speaking of which https://substack.com/home/post/p-115930233 :
> John Pierce once said in an interview, asserting the massive importance of development at Bell:
>> You see, out of fourteen people in the Bell Laboratories…only one is in the Research Department, and that’s because pursuing an idea takes, I presume, fourteen times as much effort as having it.
kmeisthax
RCA tried to duplicate Bell Labs' success and it arguably bankrupted the company.
Animats
RCA's Sarnoff Labs produced the image orthicon, NTSC color TV, an early videotape system, an early flat-panel display, and lots of vacuum-tube innovations. [1]
The big business mistakes were in the 1970s, when RCA tried to become a diversified conglomerate. At one point they owned Hertz Rent-A-Car and Banquet TV dinners.[2]
[1] https://eyesofageneration.com/rca-sarnoff-library-photo-arch...
whiplash451
Out of curiosity: is there a write up about this?
OutOfHere
I suspect the MBAs contributed in no small part to the bankruptcy.
pydry
Who ever said that they should be isolated?
The key differentiator was giving them freedom and putting them in charge, not isolating them.
leoc
Eric Gilliam's "How did places like Bell Labs know how to ask the right questions?" https://www.freaktakes.com/p/how-did-places-like-bell-labs-k... came to a similar conclusion. (It did well here just a couple of months ago, too https://news.ycombinator.com/item?id=43295865 , so it is a little disappointing that the discussion seems to be starting from the beginning here again.) Another point which you've both made is that other big US firms had very important industrial research labs, too. (RCA Labs is one that seems to get little love these days, at least outside the pages of We Were Burning https://www.hachettebookgroup.com/titles/bob-johnstone/we-we... Also, to be fair, "Areoform" did mention Xerox PARC once in TFA.) Indeed, overstating the uniqueness of Bell Labs helps to billow up the clouds of mystique, but it's probably harmful to a clear understanding of how it actually worked.
But the ultimate problem with TFA is that it seems to be written to portray venture capitalists(?), or at least this group of VCs who totally get it, as on the side of real innovation along with ... Bell Labs researchers(?) and Bell Labs executives(?) ... against the Permanent Managerial Class which has ruined everything. Such ideas have apparently been popular for a while, but I think we can agree that after the past year or two the joke isn't as funny as it used to be anymore.
smj-edison
Just read the article you mentioned -- I find the most interesting part of it to be the "system integrators", those who intentionally paid attention both to the research going on and to the on-the-ground problems. It's fascinating how they brought information back and forth, and even generated new ideas from all the connections they formed.
whatever1
ExxonMobil also closed its prolific NJ-based Corporate Strategic Research center this year, which, among many chemical-process breakthroughs, identified the CO2 emissions risks (well before academia) and invented the lithium battery.
ab5tract
CO2 emissions risks were already being discussed in the 1800s.
Furthermore, Big Oil notoriously suppressed any hint of their internal climate change models from being published and hired the same marketing firms that Big Tobacco employed.
majormajor
You have to be willing to not have things guaranteed to "work." Don't just look at the best case. Investigate and discuss how many versions of Bell Labs didn't "work."
If you just look at the success stories, you could say that today's VC model works great too - see OpenAI's work with LLMs based on tech that was comparatively stagnating inside of Google's labs. Especially if nobody remembers Theranos in 50 years. Or you could say that big government-led projects are "obviously" the way to go (moon landing, internet).
On paper, after all, both the "labs" and the VC game are about trying to fund lots of ideas so that the hits pay for the (far greater) number of failures. But they both, after producing some hits, have run into copycat management optimization culture that brings rapid counter-productive risk-aversion. (The university has also done this with publish-or-perish.)
Victims of their own success.
So either: find a new frontier funding source that hasn't seen that cycle yet (it would be ironic if some crypto tycoon started funding a bunch of pure research and that whole bubble led to fundamental breakthroughs after all, hah) or figure out how to break the human desire for control and guaranteed returns.
LeoPanthera
If you haven't already, check out the "AT&T Archives" on the AT&T Tech Channel on YouTube. It's an absolutely remarkable collection of American technology history.
teleforce
If you want to know the research culture and the environment of Bell Labs from author's first hand experiences, I'd highly recommended this book by Hamming [1].
[1] The Art of Doing Science and Engineering by Richard W. Hamming:
https://press.stripe.com/the-art-of-doing-science-and-engine...
nemild
My dad was at this talk in 1986 that PG shares on his blog:
https://paulgraham.com/hamming.html
Said it was amazing.
atakan_gurkan
Hamming gave that talk many many times. There are recordings of it on YouTube. It is also the final chapter of his book "The Art of Doing Science and Engineering", which, IMHO, is worth reading in its entirety.
badlibrarian
It's been out of stock for nearly a year. Interesting in a post talking about AT&T and Bell Labs to point out that Stripe struggles to maintain an inventory of niche printed books.
nar001
Has it? It seems to be available on Amazon at least.
nemild
You're welcome to borrow my copy, feel free to ping me.
ghaff
I think it's complicated.
A lot of large US tech corporations do have sizable research arms.
Bell Labs is certainly celebrated as part of what was then a telephone monopoly, though AT&T actually pulled out of the Multics operating system effort, and Unix was pretty much a semi-off-hours project by Ritchie and Thompson.
It's true that you tend not to have such dominant firms as in the past. But companies like Microsoft still have significant research organizations. Maybe head-turning research advancements are harder to come by than they used to be; I don't know. But some large tech firms are still putting lots of money into longer-term advances.
coolcase
Yeah, F# and TypeScript are very impressive. We just got used to tonnes of innovation. It ain't UNIX, but I'd say TypeScript is as impressive: an exoskeleton for JS that rivals Haskell.
See also VSCode and WSL.
And if we ain't impressed with LLMs, then wtf! I mean, maybe it is just nostalgia for the old times.
Lots of great stuff is coming out. Quantum computing. Open source revolution producing Tor, Bitcoin, Redis, Linux.
I think we are in the Golden age!
And it is not all from one place. Which is better.
pjmlp
The way SQL Server was ported to Linux, for example, makes use of Drawbridge.
.NET and Java also started as research projects, as did GraalVM, Maxine, LLVM, many GHC features, OCaml improvements, ...
nine_k
In a way, it's similar to the connection between "boredom" and creativity. When you don't have much to do, you can do anything, including novel and awesome things. It, of course, takes the right kind of person, or a right group of persons. Give such people a way to not think about the daily bread, and allow them to build what they want to build, study what they want to study, think about what they want to think about.
It feels anti-efficient. It looks wasteful. It requires faith in the power of reason and the creative spirit. All these things are hard to pull off in a public corporation, unless it's swimming in excess cash, like AT&T and Google did back in the day.
Notably, a lot of European science in the 16th-19th centuries was advanced by well-off people who did not need to earn their upkeep - the useless, idle class, as some said. Truth be told, not all of them advanced the sciences and arts, though.
OTOH, rational, orderly living, when every minute is filled with some predefined meaning and pre-assigned task, allows very little room for creativity and gives relatively little incentive to invent new things. Some see it as a noble ideal, and, understandably, a fiscal ideal, too.
Maybe a society needs excess sometimes, needs to burn billions on weird stuff, because that gives a chance for something genuinely new and revolutionary to be born and grow to a viable stage. In a funny way, the same monopolies that gouge prices for the common person also collect the resources necessary for such advances, which benefit that same common person (but not necessarily that same monopoly). It's an unsettling thought to have.
conception
This is what I think is the biggest benefit of having a significant UBI. Sure, lots of folks who are currently in “bullshit jobs” would sit around and watch one screen or another, but! A lot, probably more than we imagine, would get bored and… do something. Often that something would be amazing.
But lizard brains gotta keep folks under their thumb and hoard resources. Alas.
godelski
> but! A lot, probably more than we imagine, would get bored and… do something.
I'm of the same belief. We're too antsy a creature. I know on any long vacation I'll spend the first week, maybe even two (!), vegging out doing nothing. But after that I'm itching to do work. I spent 3 months unemployed before heading to college (laid off from work) and in that time taught myself programming, Linux, and other things that are critical to my career today. This seems like a fairly universal experience too! Maybe not the exact tasks, but people need time to recover and then want to do things.
I'm not sure why we think everyone would just veg out WALL-E style and why the idea is so pervasive. Everyone says "well I wouldn't, but /they/ would". I think there's strong evidence that people would do things too. You only have to look at people who retire, or the billionaire class. If the people with the greatest ability to check out and do nothing don't, why do we think so many would? People are people, after all. And if there's a secret to why some still work, maybe we should really figure that out. Especially as we're now envisioning a future where robots do all the labor.
billy99k
UBI might work in the short term, but as more and more people have kids (who learn from their parents on UBI how to also get UBI), we would run out of people actually working and paying the taxes to support it.
marcus_holmes
Which is exactly the thing they tested multiple times and found to be wrong.
People get bored doing nothing, and enjoy contributing to their community.
No, they're not going to go get shitty factory jobs. But that's OK, because all those jobs are now automated and done by robots.
But they are going to go and do something useful, because that's what people do. The anti-UBI trope that "given basic income, everyone will just sit around on their arses watching TikTok videos" has been proven wrong in every study that measured it.
conception
This assumes that most people would be satisfied with UBI and not attempt to make more money.
mjevans
UBI isn't going to get us there. Give everyone more cash and the rent-seeking _WILL_ suck harder. Same problem with blindly raising the minimum wage instead of addressing the root issue.
Basic econ 101: inelastic demand means supply can be priced at whatever the limited number of people lucky enough to get it are able to afford.
Bell Labs, and think tanks generally, work by paying _enough_ to raise someone to the capitalist-society equivalent of a noble.
Want to fix the problem for everyone in society, not just an 'intellectual elite'? Gotta regulate the market and put enough supply into it that the price is forced to drop and average __PURCHASING POWER__ rises even without otherwise raising wages.
nine_k
This has been tried, very earnestly, and it mostly sucked, then crashed. The calculation argument [1] kills it. The optimization problem that the market solves in a chaotic, decentralized way through price discovery and trading is intractable otherwise, even with all the computing power of the planet. It also requires predicting people's needs (ignoring desires), a problem more ill-posed than predicting the weather.
The market of course needs regulation, or rather stewardship: from protection of property rights all the way to limiting monopolies, dumping, etc. The market must remain free and varied in order to do its economic work for the benefit of society. No better mechanism has been invented in the last few millennia.
Redistribution to provide a safety net for those in trouble is usually a good thing to have, but it does not require dismantling the market. It mostly requires agreement within the society.
[1]: https://en.m.wikipedia.org/wiki/Economic_calculation_problem
fuzzfactor
Concepts like this would definitely be in play, and misguided UBI could result more in preservation of the status quo than in allowing abundance to spread.
That's why experiments need to be made.
Now, in terms of research pay, Bell was right up there with other prestigious institutions - elite, but not like the nobility of old.
I would say much more like the "gentleman scientists" of antiquity: whether they were patrons or patronized in some way, they could focus daily on the tasks at hand, even when those were some of the most unlikely actions to yield miracles.
Simply because the breakthroughs that are needed are the same as ever, and almost no focused task ever leads in that direction, you're going to have to do a lot of "seemingly pointless" stuff to even come up with one good thing. You'd better get started right away, and don't lift your nose from the grindstone either ;)
evidencetamper
> Basic econ 101: inelastic demand means supply can be as expensive as the limited number who are lucky enough to get it are able to afford.
In the same basic econ 101, you learn that real estate demand is localized. UBI allows folks to move to middle-of-nowhere Montana.
throwaway2037
> Notably, a lot of European science in 16-19 centuries was advanced by well-off people who did not need to earn their upkeep, the useless, idle class, as some said.
I heard a recent interview with John Carmack (of DOOM fame) who described his current style of work as that of a "citizen scientist": he has enough money, but wants to do independent research on AI/ML. I am always surprised that we don't see more former/retired hackers (many of whom got rich in the DotCom era) decide to "return to the cave" to do something exciting with open source software. Good counterexamples: (1) Mitchell Hashimoto and his Ghostty, (2) Philip Hazel and his PCRE (Perl Compatible Regular Expressions) library. When I retire (early, if all things go well), the only way that I can possibly stave off a certain, early death from intellectual inactivity would be something similar. (Laughably: I don't have 1% of the talents that John Carmack has... but a person can try!)
90s_dev
It's not about excess.
Look at some of the most famous success stories in comedy, art, music, theatre, film, etc.
A good number of them did their best work when they were poor.
"Community" is a great example. Best show ever made, hands down. Yet they were all relatively broke and overworked during the whole thing.
It's because they believed in the vision.
nine_k
Art is materially different from science and technology. Great art is known to emerge from limitations. Art is full of limitations that are self-imposed for that purpose, like the meter and rhyme in poetry, geometry and color in painting, etc. Art is primarily about processing and evoking emotions.
Science requires much more concentration on abstract thinking, loading a much larger context, if you will. It's counterproductive to do it while busy with something else. It overworks you all right, and it demands much more rigor than art.
All revolutionary new technology is initially inefficient and requires spending a lot of time and money on finding efficient solutions. The first electronic computers were terribly unwieldy, expensive, and unreliable. The same applies to the first printing presses, steam engines, aircraft, jet engines, lasers, and LLMs (arguably this still applies). It's really hard to advance technology without spending large amounts of resources, without any profit or a guarantee thereof, for years and years. This requires a large cache of such resources, prepared to be burnt on R&D.
It's investment into far future vs predictable present, VC vs day trading.
90s_dev
Tell that to a professional in the arts.
brightball
That was a great show. The best show ever made, hands down though…is “Chuck”
90s_dev
Of all shows I might dare concede to, Chuck is not in the top 50.
analog31
To some extent, the NSF did that. My graduate education was funded by the NSF, and my research didn't have an obvious practical purpose, except to enable further research.
Today, I'm in a corporate research role, and I'm still given a lot of freedom. I'm also genuinely interested in practical applications and I like developing things that people want to buy, but my ability to do those things owes a lot to the relatively freewheeling days of NSF funding 30+ years ago.
r14c
I know this is going to be an unpopular take, but isn't the idea of socialism that you have a unitary democratic government fill the role of the Huge Monopoly Foundation, so you can do things like fund research labs and be accountable to the public?
nine_k
It's the statist idea. Socialism in practice usually involves regulating the market heavily, or into oblivion altogether, and giving the State huge redistribution power. See my comment nearby on why such a setup fails to work.
A socialism where the only way to work is to own a part of an enterprise (so no "exploitation" is possible) would likely work much better, and not even require a huge state. It would be rather inflexible, though, or would mutate back into capitalism as some workers accumulated larger shares of enterprises.
r14c
Having some kind of default steward for market developments that get so competitive and fundamental that they reach full market saturation is helpful. Under a market system, at that scale, the need for growth starts to motivate companies to cut corners or squeeze their customer base to keep the numbers going up. You either end up pricing everyone out (fixed supply case) or the profit margins get so slim that only a massive conglomerate can break even (insatiable demand case). This is why making fundamental needs and infrastructure into market commodities doesn't work either.
The problem with social democracy is that it still gives capitalists a seat at the table and doesn't address the fundamental issues of empowering market radicalism. Some balance would be nice, but I don't really see that happening.
Apocryphon
Sounds like distributism.
greyw
Hardly. Socialism is about workers/communities owning the means of production. Research labs these days are mostly funded by the public. That's just about allocation of government resources.
godelski
This is what I wish academia were. I'm finishing my PhD, and despite loving teaching and research (I've been told, including by students, that I'd make a good professor), I just don't see the system doing what it should. Truthfully, I'm not aware of any such environment other than maybe a handful of small groups (both in academia and industry).
I think we've become overly metricized. In an effort to reduce waste, we created more. Some things are incredibly hard to measure, and I'm not sure why anyone would be surprised that one of those things is research - especially low-level research. You're pushing the bounds of human knowledge, creating things that did not previously exist! Not only are there lots of "failures", but how do you measure something that doesn't exist?
I write "failure" in quotes because I don't see it that way, and I feel the common framing of failure is even anti-scientific. In science we don't often (or ever) directly prove some result; instead we disprove other things and narrow down our options. In the same way, every unsuccessful result decreases your search space for understanding where the truth is. But the problem is that the solution space is so large, and in such a high dimension, that you can't effectively measure this. You're exactly right: it looks like waste. But in an effort to "save money" we created a publish-or-perish paradigm, which has obviously led to many perverse incentives.
I think the biggest crime is that it severely limits creativity. You can't take on risky or even unpopular ideas, because you need to publish, and that means passing "peer review". This process is relatively new to science, though; it didn't exist in the days of the old scientists you reference [0]. The peer review process has always been the open conversation about publications, not the publications themselves, nor a few random people reading them who have no interest and every reason to dismiss. Those are just a means to communicate, something that is trivial with today's technologies. We should obviously reject works with plagiarism and obvious factual errors, but there's no reason not to publish the rest. There's no reason we shouldn't be more open than ever [1]. But we can't do this in a world where we're in competition with one another. It only works in a world where we're united by the shared pursuit of more knowledge. Otherwise you "lose credit" or some "edge".
And we're really bad at figuring out what's impactful. Critically, the system makes it hard to make paradigm shifts. A paradigm shift requires a significant rethinking of the current process. It's hard to challenge what we know, and even harder to convince others. Every major shift first receives major pushback, and that makes it extremely difficult to publish in the current environment. I've heard many times "good luck publishing, even if you can prove it". I've also seen many ideas put on the infinite back burner because, despite confidence in the idea and its impact, it's known that in the time it'd take to get the necessary results you could have several other works published, which matters far more to your career.
Ironically, I think removing these systems will save more money and create more efficient work (you're exactly right!). We have people dedicating their lives to studying certain topics in depth. The truth is that their curiosity highly aligns with what are critical problems. Sometimes you just know and can't articulate it well until you get a bit more into the problem. I'm sure this is something a lot of people here have experienced when writing programs or elsewhere. There's many things that no one gets why you'd do until after it's done, and frequently many will say it's so obvious after seeing it.
I can tell you that I (and a large number of people) would take massive pay cuts if I could just be paid to do unconditional research. I don't care about money, I care about learning more and solving these hard puzzles.
I'd also make a large wager that this would generate a lot of wealth for a company big enough to do such a program and a lot of value to the world if academia supported this.
(I also do not think the core ideas here are unique to academia. I think we've done similar things in industry. But given the specific topic it makes more sense to discuss the academic side)
[0] I know someone is going to google oldest journal and find an example. The thing is that this was not the normal procedure. Many journals, even in the 20th century, would publish anything void of obvious error.
[1] put on open review. Include code, data, and anything else. Make comments public. Show revisions. Don't let those that plagiarize just silently get rejected and try their luck elsewhere (a surprisingly common problem)
sien
Currently, average OECD spending on R&D is ~2% of GDP. Let's say half of that is government spending.
The OECD's total GDP is ~$50 trillion per year, so 1 percent of that is roughly $500 Bn per year on government-funded research.
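As a quick sanity check of that figure, here's a minimal sketch (Python; the inputs are just the rough numbers quoted above, not official OECD data):

    # Back-of-the-envelope check of the figures in this comment.
    oecd_gdp = 50e12     # ~50 trillion USD of OECD GDP per year (rough figure)
    rd_share = 0.02      # ~2% of GDP goes to R&D (rough figure)
    gov_fraction = 0.5   # assume half of R&D spending is government money

    gov_rd = oecd_gdp * rd_share * gov_fraction
    print(f"~${gov_rd / 1e9:.0f}B per year of government-funded R&D")
    # -> ~$500B per year of government-funded R&D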
So there clearly has to be some accountability. But no doubt the system could be improved. As you say, publishing everything makes more sense these days with platforms like arXiv.
On taking pay cuts to do research: have you ever seen places offer part-time work and then allow people to research what they want with the rest of their time?
Or researchers simply doing this alongside other jobs?
Ha. Hmm. I just realised I have a cousin who does this.
kevmo314
> The reason why we don't have Bell Labs is because we're unwilling to do what it takes to create Bell Labs — giving smart people radical freedom and autonomy.
My observation has been that smart people don't want this anymore, at least not within the context of an organization. If you give your employees this freedom, many will take advantage of it and do nothing.
Those who are productive, the smartest, who thrive in radical freedom and autonomy, instead choose to work independently. After all, why wouldn't they? If they're putting in the innovation, the equity is worth way more than a paycheck.
Unfortunately, that means innovation that requires a Bell Labs isn't as common. Fortunately, one person now can accomplish way more than a 1960's engineer could and the frontier of innovation is much broader than it used to be.
I used to agree with the article's thesis but it's been nearly impossible to hire anyone who wants that freedom and autonomy (if you disagree, <username>@gmail.com). I think it's because those people have outgrown the need for an organization.
musicale
> If you give your employees this freedom, many will take advantage of it and do nothing
This was addressed in the article:
> Most founders and executives I know balk at this idea. After all, "what's stopping someone from just slacking off?" Kelly would contend that's the wrong question to ask. The right question is, "Why would you expect information theory from someone who needs a babysitter?"
also this hilarious quote from Richard Hamming:
> "You would be surprised Hamming, how much you would know if you worked as hard as [Tukey] did that many years." I simply slunk out of the office!
kevmo314
Yeah, that's the point of my next sentence. Why would someone who comes up with information theory want to give it to an employer?
I think an answer to that was a lot clearer in the 1960's when going from idea to product was much harder.
wbl
"The only secret worth keeping is out: the damn things work".
What products could Shannon have made knowing only information theory? Or CSIRO, knowing only that OFDM solved multipath? Did Bob Metcalfe make more money because everyone had Ethernet, or would he have if he'd licensed it much more exclusively?
It's very hard for a single fundamental result to be a durable competitive advantage compared to wider licensing on nicer terms. That's particularly true when so much else goes into the product.
fuzzfactor
>Why would someone who comes up with information theory want to give it to an employer?
When an employer or occupation provides a fully respectable career for life, that's your job, and it's fully respectable for that to be your life's work from that point onward. Plus, information theory doesn't represent even 1% of what Shannon had to offer anyway :)
tikhonj
This has not been my experience at all. I worked on a team with substantial autonomy and agency for a few years, and most people—not everyone, sure, but almost—naturally rose to the occasion.
People want to do good work and people want to feel like they're doing good work. If you create an environment where they feel trusted and safe, they will rise to your expectations.
I had way more trouble with people working too hard but with misaligned ideas of what "good" meant—and stepping on each other's toes—than with anyone slacking off. It's easy to work around somebody who is merely ineffectual!
And, sure, a bunch of stuff people tried did not work out. But the things that did more than made up for it. Programming and quantitative modeling are inherently high-leverage activities; unless leadership manages out all the leverage in the name of predictability, the hits are going to more than make up for the flubs.
kevmo314
Doing work on a team isn't really what the article is discussing though. I'm referring to the very research-y skunkworks-style autonomy.
I am well aware that people in companies can work effectively on teams and that people rise to the occasion in that context. If it didn't work, companies wouldn't hire. But that's not what the article is about.
smj-edison
> it's been nearly impossible to hire anyone who wants that freedom and autonomy
Interesting, this is something that I'd love to do! I'm already planning on pursuing custom chip design for molecular simulation, but I don't really want to handle the business side of things. I'd much rather work in a paid lab than get rich and sell it off. Plus, you can do so much more with a team vs being independent.
I was also homeschooled though (unschooling and tjed philosophy) so I've always been picking my own projects. Sometimes I wonder if the lack of generalist researchers comes down to education (another thing I'd love to pursue).
Zorass
“Smart people don’t need organizations anymore.” I get it: going solo is more appealing now than ever. But I can’t help thinking that some things really only happen in a kind of shared magnetic field. Not because you can’t do it alone, but because that moment when another smart person lights you up doesn’t happen in solo mode.
kevmo314
Yeah I completely agree. I see it more like the benefits of going solo have eclipsed the benefits of a team in an organization.
I don't think it's a strictly better environment but in many dimensions going solo is now better than any company. I do often long for that shared magnetic field though.
OutOfHere
Eh. AI can empower the solo worker more than anyone else.
chrsw
Hypothetical slackers didn't stop great work from coming out of the lab. I'm not sure why today would be any different.
jltsiren
Hiring smart people who want freedom and autonomy is easy. Just give them freedom, autonomy, stability, and a good enough salary. The hard part is getting them to contribute to your business. Maybe they will contribute if they find it interesting. But if you expect them to contribute, you are clearly not giving them autonomy.
Many of the smartest people I know are good at ignoring bureaucratic requirements, or at least handling them with the minimum effort necessary. And that often includes business, which many of them see as a subcategory of bureaucracy.
detourdog
They birthed an industry based on electrical properties that were barely understood. They also ended up needing a very dynamic metering and accounting system. Apple can get away with a more unified workforce because their needs are known and not unique.
nine_k
If your needs are known, they are also known to competitors.
I know that Elon Musk is not a popular figure nowadays, but he very correctly stated that competition is for losers, and the real innovators build things that competitors are just unable to copy for a long time, let alone exceed. SpaceX did that. Google, arguably, did that, too, both with their search and their (piecemeal acquired) ad network. Apple did that with iTunes.
Strive to explore the unknown when you can, it may contain yet-unknown lucrative markets.
(There is, of course, an opposite play, the IBM PC play, when you create a market explosion by making a thing open, and enjoy a segment of it, which is larger than the whole market would be if you kept it closed.)
musicale
> I’m so excited about programs like 1517’s Flux that invests $100k in people, no questions asked and lets them explore for a few months without demanding KPIs or instantaneous progress.
If Bell Labs let people explore for multiple years, a few months probably isn't enough time.
areoform
That's absolutely true! But we aren't a multi-billion dollar corporation with a war chest in the billions so sadly this is the best we can do. :(
Bell Labs was forcibly funded as part of a consent decree from the US government that allowed AT&T to continue as a monopoly as long as it invested a percentage of its yearly revenue (or profit? I forget) in research. AT&T, having no interest in changing its incredibly profitable phone network, then proceeded to do fundamental research, as required as a condition of its monopoly.
Decades later, AT&T was broken up into the Baby Bells, and the consent decree was lifted at that time. Bell Labs' fate was then sealed: it no longer had a legally required minimum funding level, and the Baby Bells were MBA-run monstrosities interested only in "research" that paid dividends within the next 6 months in a predictable fashion.
The funding model is an integral part of the story.