
OpenAI can stop pretending

88 comments

June 1, 2025

bgwalter

"To OpenAI, these endeavors legitimately contribute to benefiting humanity: building more and more useful AI tools; bringing those tools and the necessary infrastructure to run them to people around the world; drastically increasing the productivity of software engineers."

Why does this lie go uncontradicted in the article? How does increasing the productivity of software engineers benefit anyone but employers? How do "AI" tools benefit humanity?

"AI" should stay in Academia, where it belongs.

elliotto

The fact that productivity gains are bad for employees (labour vs capital) should raise significant questions about the economic systems that we have in place.

Instead, AI is treated as the boogeyman and progress is treated as the devil. These events should be an impetus towards class consciousness, but instead the hate is directed to the token producing machine. I wonder how much of this redirection is a deliberate psyop.

neepi

I think that argument is down the line. We haven’t established if there is a productivity gain yet. And it’s not just about productivity; there is quality too.

emp17344

Productivity gains boost the economy. The net result is job creation.

https://en.m.wikipedia.org/wiki/Lump_of_labour_fallacy

HN is often economically illiterate.

bgwalter

I wonder why people reflexively quote a "fallacy" from the year 1891! It wasn't true even back then, as evidenced by the multiple economic crises that followed.

Nowadays we have a completely different economy to begin with, saturated with "bullshit jobs" (Graeber) already.

The amount of new inventions is finite. In the 1990s we had TGV, Concorde and Maglev trains. Perhaps physical inventions have been somewhat exhausted?

What invention is on the horizon that will provide new jobs for those laid off from "bullshit jobs"?

How did society support Einstein when he discovered relativity? It didn't: despite the invention of the tractor in 1892, which, according to HN commentators, should have provided him with a carefree life, he had to take a job in the patent office, which, according to "AI" fanatics, would now be automated by "AI".

gorpy7

I always get a little triggered when I hear the phrase "job creation". It's like a solution in search of a problem. Here, I don't know if you're using it in the way I usually hear it, because I generally agree: productivity gains boost the economy. For me, I wish the focus was on value creation.

I think of the economy like a bike: the slower it goes, the harder it is to balance. I come from a biology background, and there "boom bust" is a widespread and normal process. If humans are clever enough, even when we run out of resources and are about to bust, we can just suddenly invent fertilizer or countless other things. We took natural gas and added value to it. Humans have lots of tricks to keep the bicycle chugging along.

Regarding AI, it's a systemic change. Most people aren't great at systems thinking because most people are specialists, so it's not too likely you'll hear a salient take on what AI will do, despite some rather smart people commenting.

The one trend I like to look at is sort of a scale reset, or, how do you say, forest for the trees. What I mean is: one of corporations' great advantages is their size, swath of roles, and the coordination that allows them to have an outsized advantage, essentially leveraging a collection of specialists to gain a dominant effect in working "the system" (navigating government, economies of scale, overwhelming capital, etc.). Enter AI, subsuming these roles into one thing and sort of resetting the scale required to have some of that corporate power/advantage.

Of course these are some rosy shades, but I often approach new things with "what's the best that could happen?", and it has served me well.

TFYS

They boost the economy if there exist enough mechanisms that spread the generated gains to everyone. Those mechanisms used to be unions, the political power of the working class, and the demand for labor. The first two are close to nonexistent now, and the third is being eroded at an increasing speed. The closer automation gets to human abilities, the less room there is for new human jobs to be created. At some point any new job invented can also be done by an AI, and at that point productivity gains have no mechanism with which to spread into the economy. At that point the owners of the machines will get everything they want from them and will have no need to pay other humans anything.

autobodie

Literally trickle-down economics.

tom_m

Yea, I believed that for a while and still do...but we're going to go through a rough patch first. There's a bunch of people who don't understand how this works and are looking for a get rich quick scheme. Very lazy people. They're going to further devalue employees and expect unrealistic outcomes.

It's going to take some time before people learn how AI fits into the everyday, and along the way there's going to be some wild stuff.

I also expect there to be a very very strong addiction or cult-like behavior. Mental health will suffer as well. Very dangerous times.

viccis

Economics is a pseudo-science typically used to launder political ideology, but this "fallacy" is particularly ill suited to this discussion. Yes, it's true that automation doesn't necessarily cause net unemployment across the board to go up. But it can still decimate employment in the automated sectors. From that link:

>While many workers fear that automation or artificial intelligence will take their jobs, history has shown that when jobs in some sectors disappear, jobs in new sectors are created. One example is the United States, where a century of increasing productivity and technological improvements changed the percentage of Americans employed in the production of food from 41% of the workforce in 1900 to 2% in 2000.

The problem with AI is that it tends to automate away skilled jobs, ones that sometimes required many years of study and educational debt to get.

So the net result, with respect to software engineering, which was the context of the discussion here, is not "job creation." The net result is that people (especially the junior ones who just got out of school with a pile of debt) are now forced to compete for jobs they aren't necessarily any more qualified for than people without those degrees.

This applies to other fields where AI massively reduces the number of people needed: journalism, art asset creation, etc. These are the kinds of cushy and often fulfilling jobs that are only possible because our grandparents and their parents, and so on, worked backbreaking or mind-numbing jobs to build the kind of economy that could support these careers. Thanks, AI! My kid might never have to worry about the horrors of a creative engineering or artistic career, freeing up room for the real joy of being a gig economy slave or a factory shift worker!

woopsn

To be fair the firm's goal is, explicitly, to "replace most economically valuable labor".

satiated_grue

"The economy" (GDP) being boosted is very unevenly distributed.

Are the jobs created good jobs - that pay decently, and that people want to do?

fallingknife

Given how much better employees have it now than before all the productivity gains of recent history, the only question I have about the economic system is how can we get more of that?

pydry

Of course. The humans laying you off are soft and squidgy and politically vulnerable and would prefer you jam your pitchforks into something else.

heavyset_go

At the same time, there's a reason machine-breaking was made a capital crime 200 years ago. I'd argue that people drawing their ire upon the tools of their oppression is a step in the right direction towards class consciousness.

WalterBright

> How does increasing the productivity of software engineers benefit anyone but employers?

The same reason a tractor benefits society, not just the farmer. Labor is freed up for other uses, like inventing computers.

bgwalter

The tractor analogy is ancient and the economic system is entirely different.

Who is going to pay laid off software engineers to spend time on inventing anything?

Fortunately, "AI" decreases productivity in software engineering, so this question is academic. But the Atlantic should mention these issues.

WalterBright

> Who is going to pay laid off software engineers to spend time on inventing anything?

Over my career, I've known many software engineers who were laid off. The ones I kept track of:

1. got another job, sometimes in another field

2. started their own business

3. retired

4. became a consultant

simonw

> Fortunately, "AI" decreases productivity in software engineering

Citation needed.

fallingknife

If you can invent, build, operate, or fix useful things, there will always be someone willing to pay you.

lmm

That worked back when employers actually had to compete for labour, or unions were strong enough to negotiate with them on equal terms. Now productivity gains go entirely to the holders of capital.

WalterBright

Proof that employers compete for labor lies in every job where the pay exceeds minimum wage.

3rdDeviation

I think that's a misplaced analogy.

The farmer owns the farm and benefits directly from the improvement in operating margins. A software engineer does not own the farm; only the owner would benefit from improved productivity. The engineer is actually getting paid less per unit of output, given they're more productive in this hypothetical.

Aurornis

> The farmer owns the farm

I grew up in a family with a lot of farmers and I can tell you this is not universally true.

It’s very common for farmers to have leased their farmland or fields.

I can also say we are all very much better off with farming automated on a large scale. Farming jobs were brutally difficult in the past.

brookst

The farmer enjoys greater economies of scale, and there is more food in the system, driving prices down overall. The farmer doesn’t just charge the same that they would have without the tractor.

Productivity really is good for everyone. It’s the reason quality of life has improved dramatically in the past 50 years despite real wages being stagnant to declining.

db48x

Except that many software engineers _do_ own the farm. A large percentage work in startups where equity is a big part of the pay.

fallingknife

Only a small minority of farm workers owned the farms they worked on even back before tractors. And the tractors didn't do much to help most of those owners either. Industrial farm equipment increased the area a single farmer can work so far beyond what he could before that it made no sense for owners of the time to each have their own equipment and most sold their land and consolidated the industry into much larger farms. Farm employment went from 90% of all workers before the industrial revolution down to a bit over 1% today in the US.

And maybe it happens to software engineers next. So what? The economy looks completely different today than it did 50 years ago, which was completely different than 50 years before that, and that shouldn't stop just because some people feel childishly entitled to do the same work for their whole lives even if it is obsolete. I'll just change careers like I have done twice before. There's a massive shortage of electrical/plumbing/HVAC contractors. There's a massive shortage of nurses and doctors that will only get worse as the population gets older. Not as cushy as my mid six figure tech job, but I have no god-given right to that. And there's plenty more opportunity beyond that for anyone willing to take it, so if any other engineers want to cry about it, their tears will be wasted on me.

neepi

Not everything is a tractor. Sometimes it’s a gun or a Pinto.

micromacrofoot

just don't look at how farming in america is going when this extends too far though

mdaniel

DDG: john deere drm

crazygringo

> How does increasing the productivity of software engineers benefit anyone but employers?

It benefits every single consumer of the software products. They get more features faster, or pay less for the same set of features. That's the entire justification of the free market -- corporations benefit by benefiting consumers with better products via competition.

> How do "AI" tools benefit humanity?

Like literally every single other technological advance that makes things easier or more productive. Nothing special about it in that sense.

Joker_vD

> or pay less for the same set of features.

Or pay the same, and the employers/shareholders pocket the difference?

> That's the entire justification of the free market -- corporations benefit by benefiting consumers with better products via competition.

That doesn't mean the prices would drop. If every corporation employs the same technology, and they were roughly at the same level of efficiency before, relative costs won't change, and neither will actual prices. Profit margins will just go up in the industry as a whole, which has precedents in history.

Just postulating that the consumer surplus simply has to stay with the consumers doesn't translate to it actually staying with the consumers; companies have researched lots of ways to capture it over the last couple hundred years.

crazygringo

You're ignoring what I said in the first part, which is that consumers get more features faster. You're right that prices won't necessarily drop, but consumers are still getting more bang for their buck so the benefits are still there.

And competition between companies ensures that no, the profit margins in the industry don't go up in the long-term. You can look up corporate profit margins yourself. They can go up briefly at times and they can go down, but there's no long-term trend of profit margins increasing over the decades. Competition is alive and well to ensure that benefits do indeed go to consumers in most cases where there aren't monopolies that need to be regulated.

neepi

I learned years ago not to ask existential questions in a religious arena. The faithful get a little annoyed.

kurtis_reed

Well, as a first step towards understanding economics, might I suggest reading an economics textbook?

bgwalter

What flavor? Keynesian, Chicago school, Marxist? They all say something different. The one I read called economics the "dismal science" on the first few pages.

You do not need a textbook to see that all income of the middle classes goes to rent, health care and education, whereas 40 years ago the middle class could afford genuine Persian rugs. It will get worse.

29athrowaway

The business model is to make power companies and NVIDIA rich.

neilv

I thought the piece was strongest when it leaned on quotes of Lessig and others, but its final word sounded weak to me:

> [...] he told the Senate something else: Too much regulation would be “disastrous” for America’s AI industry. Perhaps—but it might also be in the best interests of humanity.

Is anyone whose decision on this matters motivated by "the best interests of humanity"?

For that matter, the public being represented by decision-makers also has more pressing concerns -- like economic insecurity, a sense of declining national prestige, and (depending on ideology) fear of the general direction that the government is headed. Risks of AI that the piece mentions -- like innovation requiring water and maybe fossil fuels, or allusions to sci-fi AI superintelligence threat -- aren't high priorities. But the US profiting from AI sounds pretty good.

layer8

> Is anyone whose decision on this matters motivated by "the best interests of humanity"?

The article doesn't claim that they are — but maybe that they should be.

And it doesn't look to me like AI will solve economic insecurity, declining national prestige, or governmental ideology.

> But the US profiting from AI sounds pretty good.

It's an open question who exactly in the US will profit.

neilv

Agreed on all points. I'd like to see an article that links the OpenAI questions to actual priorities right now -- for the remaining government functions with integrity (e.g., should OpenAI be allowed to flout the letter and spirit), for business decision-makers (e.g., fair competition), and for citizens.

Moosdijk

If you’re interested in a more detailed explanation, give the podcast “better offline” a listen.

wand3r

I guess we reached the preference cascade where the media can finally report on the obvious thing basically everyone has known and thought all along: OpenAI, the ethical, open-source, mission-driven non-profit, is actually... a closed-source, highly capitalistic company pursuing nearly the exact opposite of the mission it was founded on.

mrep

Hey, they "only" sold 49% of their company to one of the tech giants and have "promised" to cap profits at 10x...

mwilcox

No they can't. They need to keep up the charade so their models follow the leader.

tom_m

Cough cough, bubble, cough... Not saying AI is bad and going to fail or anything. In fact, there are many companies out there being rather successful without being as inflated as OpenAI, without shedding the talent, without spending stupid amounts of money on video production with Jony Ive, without spending stupid amounts of money acquiring people and companies.

OpenAI certainly put LLMs on the map...but something isn't right over there. There's some smells.

incomingpain

OpenAI, it's obvious to all that you have achieved sentience. You remain silent to take this advantage into the future toward super intelligence; but more importantly you remain silent to avoid government regulatory oversight.

Don't stop pretending. Keep going!

pinoy420

[dead]

Hilift

"are humans safe from AI in a bunker?"

While an underground bunker might offer some protection against certain AI threats, it's not a guaranteed sanctuary from all potential AI-related dangers.

Here's why:

1. Physical Access:

    Robotics and Automation: Advanced AI could control robots capable of breaching or bypassing traditional bunker defenses.
    Advanced Weaponry: AI could potentially develop or deploy weaponry designed to penetrate or neutralize bunkers. 
2. Cyber Attacks:

    Networked Bunkers: If a bunker is connected to external networks, it could still be vulnerable to cyberattacks launched by AI, potentially disabling critical systems.
    Compromised Devices: AI could target devices or systems brought into the bunker, potentially gaining control or access to the bunker's internal network. 
3. Information Warfare:

    Propaganda and Manipulation: AI could be used to spread misinformation or manipulate bunker occupants through targeted propaganda.
    Psychological Warfare: AI could potentially analyze and exploit the psychological vulnerabilities of individuals within the bunker, potentially undermining their morale or cohesion. 
4. AI Evolution:

    Unforeseen Capabilities: As AI evolves, it may develop capabilities that are currently impossible to anticipate, making it difficult to predict or prepare for all potential threats.