Time saved by AI offset by new work created, study suggests

NalNezumi

I can't find the article anymore, but I remember reading almost 10 years ago an article in The Economist saying that the result of automation was not the removal of jobs but more work + fewer junior employment positions.

The example they gave was that search engines + digital documents cut junior lawyer headcount by a lot. Prior to digital documents, a fairly common junior lawyer task was: "we have an upcoming court case. Go to the (physical) archive and find past cases relevant to the current case. Here's what to check for:" and this task would be assigned to a team of juniors (3-10 people). But now one junior with a laptop suffices. As a result the firm can also manage more cases.

Seems like a pretty general pattern.

Balgair

Dwarkesh had a good interview with Zuck the other week. And in it, Zuck had an interesting example (that I'm going to butcher):

FB has long wanted to have a call center for its ~3.5B users. But that call center would automatically be the largest in history and cost ~15B/yr to run. Something that is cost ineffective in the extreme. But, with FB's internal AIs, they're starting to think that a call center may be feasible. Most of the calls are going to be 'I forgot my password' and 'it's broken' anyways. So having a robot guide people along the FAQs in the 50+ languages is perfectly fine for ~90% (Zuck's number here) of the calls. Then, with the harder calls, you can actually route them to a human.

So, to me, this is a great example of how the interaction of new tech and labor is a fractal, not a hierarchy. In that, with each new tech that your specific labor sector finds, you get this fractalization of the labor in the end. Zuck would never have thought of a call center before, denying that labor to many people. But this new tech allows for a call center that looks a lot like the old one, just with only the hard problems. It's smaller, yes, but it looks the same and yet is slightly different (hence a fractal).

Look, I'm not going to argue that tech is disruptive. But what I am arguing is that tech makes new jobs (most of the time), it's just that these new jobs tend to be dealing with much harder problems. Like, we're pushing the boundaries here, and that boundary gets more fractal-y, and it's a more niche and harder working environment for your brain. The issue, of course, is that, like with a grad student, you have to trust that the person working at the boundary is actually doing work and not just blowing smoke. That issue, the one of trust, I think is the key issue to 'solve'. Cal Newport talks a lot about this now, and how these knowledge workers really don't do much for a long time, and then they have these spurts of genius. It's a tough one, and not an intellectual enterprise, but an emotional one.

firefoxd

I worked in automated customer support, and I agree with you. By default, we automated 40% of all requests. It becomes harder after that, not because the problems the next 40% face are any different, but because they are unnecessarily complex.

A customer who wants to track the status of their order will tell you a story about how their niece is visiting from Vermont and they wanted to surprise her for her 16th birthday. It's hard because her parents don't get along as they used to after the divorce, but they are hoping that this will at the very least put a smile on her face.

The AI will classify the message as order tracking correctly, and provide all the tracking info and timeline. But because of the quick response, the customer will write back to say they'd rather talk to a human and ask for a phone number they can call.

The remaining 20% can't be resolved by either human or robot.
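
As a rough illustration of the flow described above (classify the request, answer the simple case, escalate the rest), here is a minimal sketch. The function names, intents, and confidence threshold are hypothetical assumptions, not how any particular vendor implements it:

    # Minimal sketch of a classify-then-escalate support flow.
    # All names and thresholds here are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class Classification:
        intent: str        # e.g. "order_tracking", "password_reset"
        confidence: float  # 0.0 - 1.0

    def classify(message: str) -> Classification:
        # Placeholder: a real system would call an LLM or a trained intent model.
        if "track" in message.lower() or "order" in message.lower():
            return Classification("order_tracking", 0.92)
        return Classification("unknown", 0.30)

    def handle(message: str) -> str:
        c = classify(message)
        if c.intent == "order_tracking" and c.confidence > 0.8:
            return "Here is your tracking link and delivery estimate."
        # Low confidence or unsupported intent: route to a human agent.
        return "Connecting you to a human agent."

    print(handle("My niece is visiting from Vermont... can you track my order?"))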

dweinus

Between the lines, you highlight a tangential issue: execs like Zuckerberg think the easy/automatable stuff is 90%. People with skin in the game know it is much less (40% per your estimate). This isn't unique to LLMs. Overestimating the benefit of automation is a time-honored pastime.

exe34

> But because of the quick response, the customer will write back to say they'd rather talk to a human

Is this implying it's because they want to wag their chins?

My experience recently with moving house was that most services I had to call had some problem that the robots didn't address. Fibre was listed as available on the website, but then it crashed when I tried "I'm moving home" - turns out it's available in the general area but not for the specific row of houses (had to talk to a human to figure it out). With the water company, I had an account at house N-2, but at N-1 it was included, so the system could not move me from my N-1 address (no water bills) to house N (water bill). Pretty sure there was something about power and council tax too. With the last one I just stopped bothering, figuring that it's the one thing where they would always find me when they're ready (they got in touch eventually).

SoftTalker

Zuck is just bullshitting here, like most of what he says.

There is zero chance he wants to pay even a single person to sit and take calls from users.

He would eliminate every employee at Facebook if it were technically possible to automate what they do.

ivape

He can definitely fire most people at Facebook. He just doesn't because it would be like not providing a simple defense against a pawn move on a Chess board. No point in not matching the opposition's move if you can afford it. They hire, we hire, they fire, we fire.

palmotea

> Most of the calls are going to be 'I forgot my password' and 'it's broken' anyways. So having a robot guide people along the FAQs in the 50+ languages is perfectly fine for ~90% (Zuck's number here) of the calls.

No it isn't. Attempts to do this are why I mash 0 repeatedly and chant "talk to an agent" after being in a phone tree for longer than a minute.

Matthyze

And you don't think this will improve with better bots?

exe34

I try to enunciate very clearly: "What would you like to do?" - "Speak to a fcuking human. Speak to a fcuking human. Speak to a fcuking human. Speak to a fcuking human."

wslh

Sorry for the acidity, just training my patience while waiting for the mythical FB/AI call center.

Maxion

As someone who has been involved with customer support (on the in-house tech side), I can say that the vast majority of contacts to a CS team will be either very inane or extremely inane. If you can automate away the lowest tier of support with LLMs, you'll improve response times not just for the simple questions but also for the hard ones.

Balgair

Yeah, I was a little credulous about what Zuck said there too.

Like, if AI is so good, then it'll just eat away at those jobs and get asymptotically close to 100% of the calls. If it's not that good, then you've got to loop in the product people and figure out why everyone is having a hard time with whatever it is.

Generally, I'd say that calls are just another feedback channel for the product. One that FB has thus far been fine without consulting, so I can't imagine its contribution can be all that high. (Zuck also goes on to talk about the experiments they run on people with FB/Insta/WA, and woah, it is crazy unethical stuff he casually throws out there to Dwarkesh)

Still, to the point here: I'm still seeing AI mostly as a tool/tech, not something that takes on an agency of its own. We, the humans, are still the thing that says 'go/do/start', the prime movers (to borrow a long-held and false bit of ancient physics). The AIs aren't initiating things, and it seems, to a large extent, we're not going to want them to do so. Not out of a sense of doom or lack-of-greed, but simply because we're more interested in working at the edge of the fractal.

toxik

I don't know about lawyering, but with engineering research, I can now ask ChatGPT's Deep Research to do a literature review on any topic. This used to take time and effort.

tom_m

Definitely. When computers came out, jobs increased. When the Internet became widely used, jobs increased. AI is simply another tool.

The sad part is, do you think we'll see this productivity gain as an opportunity to stop the culture of overworking? I don't think so. I think people will expect more from others because of AI.

If AI makes employees twice as efficient, do you think companies will decrease working hours or cut their employment in half? I don't think so. It's human nature to want more. If 2 is good, 4 is surely better.

So instead of reducing employment, companies will keep the same number of employees because that's already factored into their budget. Now they get more output to better compete with their competitors. To reduce staff would be to be at a disadvantage.

So why do we hear stories about people being let go? AI is currently a scapegoat for companies that were operating inefficiently and over-hired. It was already going to happen. AI just gave some of these larger tech companies a really good excuse. They weren't exactly going to admit they made a mistake and over-hired, now were they? Nope. AI was the perfect excuse.

As with all things, it's cyclical. Hiring will go up again. The AI boom will bust. On to the next thing. One thing is for certain though: we all now have a fancy new calculator.

jvanderbot

You either believe that companies are trying to grow as much as possible within their current budget, or not.

Automation is one way to do that.

f_allwein

The Productivity Paradox is officially a thing. Maybe that’s what you’re thinking of?

https://en.m.wikipedia.org/wiki/Productivity_paradox

breppp

Bullshit Jobs explores this theme a lot, both in the article and in the subsequent book.

https://libcom.org/article/phenomenon-bullshit-jobs-david-gr...

PeterStuer

Read the article. The book is just a long, boring expansion with no new content.

breppp

I do not agree; I think the book is much more interesting than the article. For example, the types of jobs such as Box Tickers and Flunkies, as well as some really interesting anecdotes.

yieldcrv

15 years ago I created my own LLC, got work experience from some contracts, and had a friend answer the reference checks

I skipped over junior positions for the most part

I don’t see that not working now

JCM9

Modern AI tools are amazing, but they're amazing like spell check was amazing when it came out. Does it help with menial tasks? Yes, but it creates a new baseline that everyone has and just moves the bar. There's scant evidence that we're all going to just sit on a beach while AI runs your company anytime soon.

There’s little sign of any AI company managing to build something that doesn’t just turn into a new baseline commodity. Most of these AI products are also horribly unprofitable, which is another reality that will need to be faced sooner rather than later.

amarant

It's got me wondering: does any of my hard work actually matter? Or is it all just pointless busy-work invented since the industrial revolution to create jobs for everyone, when in reality we would be fine if like 5% of society worked while the rest slacked off? I don't think we'd have as many videogames, but then again, we would have time to play, which I would argue is more valuable than games.

To paraphrase Lee Iacocca: We must stop and ask ourselves, how much videogames do we really need?

randcraw

> It's got me wondering: does any of my hard work actually matter?

I recently retired from 40 years in software-based R&D and have been wondering the same thing. Wasn't it true that 95% of my life's work was thrown away after a single demo or a disappointingly short period of use?

And I think the answer is yes, but this is just the cost of working in an information economy. Ideas are explored and adopted only until the next idea replaces them or the surrounding business landscape shifts yet again. Unless your job is in building products like houses or hammers (which evolve very slowly or are too expensive to replace), the cost of doing business today is a short lifetime for any product; they're replaced in increasingly fast cycles, useful only until they're no longer competitive. And this evanescent lifetime is especially the case for virtual products like software.

The essence of software is to prototype an idea for info processing that has utility only until the needs of business change. Prototypes famously don't last, and increasingly today, they no longer live long enough even to work out the bugs before they're replaced with yet another idea and its prototype that serves a new or evolved mission.

Will AI help with this? Only if it speeds up the cycle time or reduces development cost, and both of those have a theoretical minimum, given the time needed to design and review any software product has an irreducible minimum cost. If a human must use the software to implement a business idea then humans must be used to validate the app's utility, and that takes time that can't be diminished beyond some point (just as there's an inescapable need to test new drugs on animals since biology is a black box too complex to be simulated even by AI). Until AI can simulate the user, feedback from the user of new/revised software will remain the choke point on the rate at which new business ideas can be prototyped by software.

zeroonetwothree

I still have code running in production I wrote 20 years ago. Sure, it’s a small fraction, but arguably that’s the whole point.

vemom

Most of a chef's meals are now poo. Memories of those meals survive, but eventually they will fade too.

There is a lot of value in being the stepping stone to tomorrow. Not everyone builds a pyramid.

dehrmann

So... to what extent is software a durable good?

hliyan

I've been thinking similarly. Bertrand Russell once said: "there are two types of work. One, moving objects on or close to the surface of the Earth. Two, telling other people to do so". Most of us work in buildings that don't actually manufacture, process, or move anything. Instead, we process information that describes manufacturing and transport. Or we create information for people to consume when they are not working (entertainment). Only a small fraction of human beings are actually producing things that are necessary for physiological survival. The rest of us are, at best, helping them optimize that process, or at worst, leeching off of them in the name of "management" of their work.

npteljes

>does any of my hard work actually matter?

Yes... basically in life, you have to find the definition of "to matter" that you can strongly believe in. Otherwise everything feels aimless, the very life itself.

The rest of what you ponder in your comment is the same. And I'd like to add that baselines have shifted a lot over the years of civilization. I like to think about one specific example: painkillers. Painkillers were not used during medical procedures in a widespread manner until some 150 years ago, maybe even later. Now, it's much less horrible to participate in those procedures, for everyone involved really, and the outcomes are better just for this factor - because the patient moves around less while anesthetized.

But even this is up for debate. All in all, it really boils down to what the individual feels like it's a worthy life. Philosophy is not done yet.

amarant

Well, from a societal point of view, meaningful work would be work that is necessary to either maintain or push that baseline.

Perhaps my initial estimate of 5% of the workforce was a bit optimistic, say 20% of current workforce necessary to have food, healthcare, and maybe a few research facilities focused on improving all of the above?

cortesoft

> Don't think we'd have as many videogames, but then again, we would have time to play, which I would argue is more valuable than games.

Would we have fewer video games? If all our basic needs were met and we had a lot of free time, more people might come together to create games together for free.

I mean, look at how much free content (games, stories, videos, etc) is created now, when people have to spend more than half their waking hours working for a living. If people had more free time, some of them would want to make video games, and if they weren’t constrained by having to make money, they would be open source, which would make it even easier for someone else to make their own game based on the work.

jajko

Mine doesn't, and I am fine with that; I never needed such validation. I derive fulfillment from my personal life and achievements and passions there, more than enough. With that lens, office politics and the promotion rat race and what people do in them just make me smile. Seeing how otherwise smart folks ruin (or miss out on) their actual lives and families in pursuit of excellence in a very narrow direction, often underappreciated by employers and not rewarded adequately. I mean, at a certain point you either grok the game and optimize, or you don't.

The work brings modest wealth over time, allows me and my family to live in a long-term safe place (Switzerland), and builds a small reserve for bad times (or inheritance, early retirement etc.; this is Europe, no need to save up for kids' education or potentially massive healthcare bills). Don't need more from life.

qwerpy

Agree. Now I watch the rat racers with bemusement while I put in just enough to get a paycheck. I have enough time and energy to participate deeply in my children’s upbringing.

I’m in America so the paychecks are very large, which helps with private school, nanny, stay at home wife, and the larger net worth needed (health care, layoff risk, house in a nicer neighborhood). I’ve been fortunate, so early retirement is possible now in my early 40s. It really helps with being able to detach from work, when I don’t even care if I lose my job. I worry for my kids though. It won’t be as easy for them. AI and relentless human resources optimization will make tech a harder place to thrive.

Clubber

>It's got me wondering: does any of my hard work actually matter?

It mattered enough for someone to pay you money to do it, and that money put food on the table and clothes on your body and a roof over your head and allowed you to contribute to larger society through paying taxes.

Is it the same as discovering that E = mc² or Jonas Salk's contributions? No, but it's not nothing either.

Phanteaume

You're on the right path, don't fall back into the global gaslight. Go deeper.

http://youtube.com/watch?v=9lDTdLQnSQo

kjkjadksj

Most work is redundant and unnecessary. Take, for example, the classic gas-station-on-every-corner situation that often emerges. This turf war between gas providers (or the franchisees they've licensed for the location) is not because three or four gas stations are operating at maximum capacity. No, this is 3 or 4 fishermen with a line in the river, made possible solely because inputs (real estate, gas, labor, merchandise) are cheap enough that the gas station need not ever run even close to capacity and still return a profit for the fisherman.

Who benefits from the situation? You or I, who don't have to make a U-turn to get gas at this intersection, perhaps, but that is not much benefit compared to the opportunity cost of having 3 prime corner lots squandered on the same single use. The clerk at the gas station, for having a job available? Perhaps, although maybe their labor in aggregate would have been employed in other, less redundant uses that could benefit our society more than selling smokes and putting $20 on 4 at 3am. The real beneficiary of this entire arrangement is the fisherman, the owner or shareholder who ultimately skims from all the pots, thanks to having what is effectively a modern version of a plantation sharecropper: spending all their money in the company store and on company housing, with a fig leaf of being able to choose from any number of minimum-wage jobs, spend their wages in any number of national chain stores, and rent any number of increasingly investor-owned properties. Quite literally all owned by the same shareholders, when you consider how people diversify their investments across these multiple sectors.

senordevnyc

We benefit because when there’s only one gas station, they can charge more than if there are four.

arealaccount

It's why executive types are all hyped about AI. Being able to code 2x more will mean they get 2x more things (roughly speaking), but the workers aren't going to get 2x the compensation.

npteljes

Indeed. And AI does its work without those productivity-hindering things like need for recreation and sleep, ethical treatment, and a myriad of others. It's a new resource to exploit, and that makes everyone excited who is building on some resource.

pythonguython

AI can't do our jobs today, but we're only 2.5 years from the release of ChatGPT. The performance of these models might plateau today, but we simply don't know. If they continue to improve at the current rate for 3-5 more years, it's hard for me to see how human input would be useful at all in engineering.

codr7

They will never be creative, and creativity is a pretty big deal.

pythonguython

To the extent it’s measurable, LLMs are becoming more creative as the models improve. I think it’s a bold statement to say they’ll NEVER be creative. Once again, we’ll have to see. Creativity very well could be emergent from training on large datasets. But also it might not be. I recommend not speaking in such absolutes about a technology that is improving every day.

lsy

I feel like people in the comments are misunderstanding the findings in the article. It’s not that people save time with AI and then turn that time to novel tasks; it’s that perceived savings from using AI are nullified by new work which is created by the usage of AI: verification of outputs, prompt crafting, cheat detection, debugging, whatever.

This seems observationally true in the tech industry, where the world’s best programmers and technologists are tied up fiddling with transformers and datasets and evals so that the world’s worst programmers can slap together temperature converters and insecure twitter clones, and meanwhile the quality of the consumer software that people actually use is in a nosedive.

acedTrex

> where the world’s best programmers and technologists are tied up fiddling with transformers and datasets and evals so that the world’s worst programmers can slap together temperature converters and insecure twitter clones

This statement is incredibly accurate

ausbah

the state of consumer software is already so bad & LLMs are trained on a good chunk of that, so their output can possibly produce worse software, right? /s

_heimdall

This is effectively Jevons paradox[1] in action.

The cost, in money or time, for getting certain types of work done decreases. People ramp up demand to fill the gap, "full utilization" of the workers.

It's a very old claim that the next technology will lead to a utopia where we don't have to work, or we work drastically less often. Time and again we prove that we don't actually want that.

My hypothesis (I'm sure it's not novel or unique) is that very few people know what to do with idle hands. We tend to keep stress levels high as a distraction, and tend to freak out in various ways if we find ourselves with low stress and nothing that "needs" to be done.

[1] https://en.m.wikipedia.org/wiki/Jevons_paradox

vjvjvjvjghv

I think a lot of people would be fine being idle if they had a guaranteed standard of living. When I was unemployed for a while, I was pretty happy in general but stressed about money running out. Without the money issue the last thing I would want to do is to sell my time to a soulless corporation. I have enough interests to keep me busy. Work just sucks up time I would love to spend on better things.

_heimdall

Oh for sure, I should have included that. I was thinking of people being idle by choice rather than circumstance.

n_ary

> It's a very old claim that the next technology will lead to a utopia where we don't have to work, or we work drastically less often. Time and again we prove that we don't actually want that.

It actually does, but due to the skewed distribution of the rewards gained from that tech (automation), it does not work out for the common folks.

Let's take a simple example: you, me, and 8 other HN users work in Bezos' warehouse. We each work 8h/day. Suddenly a new tech comes in which can do the same task we do, and each unit of that machine can do the work of 2-4 of us alone. If Bezos buys 4 of the units and sets each unit to work at 2x capacity, then the 8 of us now have 8h/day x 5 days x 4 weeks = 160h of leisure.

Problem is, the 8 of us still need money to survive (food, rent, utilities, healthcare etc). So, according to tech utopians, the 8 of us can now use 160h of free time to focus on more important and rewarding work. (See, in the context of all the AI peddlers, how using AI will free us to do more important and rewarding work!) But to survive, my rewarding work turns out to be gig work, or something of the same effort or more hours.

So in theory, the owner controlling the automation gets more free time to attend interviews and political/social events. The people getting automated away fall downward and have to work harder just to maintain their survival. Of course, I hope our over-enthusiastic brethren who are paying LLM providers for the privilege of training their own replacements figure out the equation soon, and don't get sold on the "free time to do more meaningful work" line the same way the Bezos warehouse gave some of us some leisure while the automation was coming online and needed some failsafe for a while. :)

BriggyDwiggs42

I don’t think it’s the consequence of most individuals’ preferences. I think it’s just the result of disproportionate political influence held by the wealthy, who are heavily incentivized to maximize working hours. Since employers mostly have that incentive, and since the political system doesn’t explicitly forbid it, there aren’t a ton of good options for workers seeking shorter hours.

hatefulmoron

> there aren’t a ton of good options for workers seeking shorter hours.

But you do have that option, right? Work 20 hours a week instead of 40. You just aren't paid for the hours that you don't work. In a world where workers are exchanging their labor for wages, that's how it's supposed to work.

For there to be a "better option" (as in, you're paid money for not working more hours) what are you actually being paid to do?

For all the thoughts that come to mind when I say "work 20 hours a week instead of 40" -- that's where the individual's preference comes in. I work more hours because I want the money. Nobody pays me to not work.

BriggyDwiggs42

> Nobody pays me to not work.

If you're in the US, then in theory you're getting overtime for going over 40 hrs a week. That's time and a half for doing nothing, correct? I'd expect your principles put you firmly against overtime pay.

>But you do have that option, right? Work 20 hours a week instead of 40. You just aren't paid for the hours that you don't work. In a world where workers are exchanging their labor for wages, that's how it's supposed to work.

Look, the core of your opinion is the belief that market dynamics always naturally lead to desirable outcomes. I simply don't believe that, and I think interference to push for desirable outcomes which violate principles of a free market is often good. We probably won't be able to agree on this.

dragonwriter

> But you do have that option, right?

Not really. Lots of kinds of work don't hire part-timers in any volume, period. There are very limited jobs where the only tradeoff, if you want to work fewer hours, is a reduction in compensation proportional to the reduction in hours worked, or even just a reduction in compensation even if disproportionate to the reduction in hours worked.

vjvjvjvjghv

At least in the US, part-time work is often not really a thing. A while ago I talked to HR about reducing to 32 hours and they didn't seem to get the idea at all. It's either all in or nothing. In the US there is also the health insurance question.

For my relatives in Germany going part time seems easier and more accepted by companies.

linsomniac

>technology will lead to a utopia where we don't have to work

I'm kind of ok with doing more work in the same time, though if I'm becoming way more effective I'll probably start pushing harder on my existing discussions with management about 4 day work weeks (I'm looking to do 4x10s, but I might start looking to negotiate it to "instead of a pay increase, let's keep it the same but a 4x8 week").

If AI lets me get more done in the same time, I'm ok with that. Though, on the other hand, my work is budgeting $30/mo for the AI tools, so I'm kind of figuring that any time that personally-purchased AI tools are saving me, I deduct from my work week. ;-)

>very few people know what to do with idle hands

"Millions long for immortality that don't know what to do with themselves on a rainy Sunday afternoon." -- Susan Ertz

nathan_douglas

Thank you! I didn't know this had a name. I remember thinking something along these lines in seventh grade social studies when we learned that Eli Whitney's cotton gin didn't actually end up improving conditions for enslaved people.

I suspected this would be the case with AI too. A lot of people said things like "there won't be enough work anymore" and I thought, "are you kidding? Do you use the same software I use? Do you play the same games I've played? There's never enough time to add all of the features and all of the richness and complexity and all of the unit tests and all of the documentation that we want to add! Most of us are happy if we can ship a half-baked anything!"

The only real question I had was whether the tech sector would go through a prolonged, destructive famine before realizing that.

Retric

Food production is a classic case where once productivity is high enough you simply get fewer farmers.

We are currently a long way from that kind of change as current AI tools suck by comparison to literally 1,000x increases in productivity. So, in well under 100 years programming could become extremely niche.

_heimdall

We are seeing an interesting limit in the food case though.

We increased production and needed fewer farmers, but we now have so few farmers that most people have very little idea of what food really is, where it comes from, or what it takes to run our food system.

Higher productivity is good to a point, but eventually it risks becoming too fragile.

master_crab

100%. In fact, this exact scenario is playing out in the cattle industry.

Screwworm, a parasite that kills cattle in days, is making a comeback. And we are less prepared for it this time because previously (the 1950s-1970s) we had a lot more labor in the industry to manually check each head of cattle. Bloomberg even called it out specifically:

Ranchers also said the screwworm would be much deadlier if it were to return, because of a lack of labor. “We can’t fight it like we did in the ’60s, we can’t go out and rope every head of cattle and put a smear on every open wound,” Schumann said.

https://www.bloomberg.com/news/features/2025-05-02/deadly-sc...

quantumHazer

> Food production is a class case where once productivity is high enough you simply get fewer farmers.

Yes, but.

There are more jobs in other fields that are adjacent to food production, particularly in distribution. The middle class did not exist back then, and retail workers are now a large percentage of workers in most parts of the world.

Retric

Sure, but when farmers were 90% of the labor force, many of the remaining 10% were also related to food distribution and production: a village blacksmith was mostly in support of farming, salt production/transport was for food storage, etc.

Food is just a smaller percentage of the economy overall.

anticensor

Did you type in an additional zero there?

Retric

Nope. Where a family might struggle to efficiently manage 50 acres under continuous cultivation even just a few hundred years ago, now it's not uncommon to see single-family farms of 20,000 acres, each acre of which is several times more productive.

It's somewhat arbitrary where you draw the line historically, but it's not just maximum productivity; it's worth remembering that crops used to fail from drought etc. far more frequently.

Small hobby farms are also a thing these days, but that’s a separate issue.

zdragnar

Econ 101: supply is finite, demand infinite. Increased efficiency of production means that demand will meet the new price point, not that demand will cease to exist.

There are probably plenty of goods that are counter examples, but time utilization isn't one of them, I don't think.

tehjoker

> Time and again we prove that we don't actually want that.

That's the capitalist system. Unions successfully fought to decrease the working day to 8 hrs.

_heimdall

I don't think we can so easily pin it on capitalism. Capitalism brings incentives that drive work hours and expectations up for sure, but that's not the only thing in play.

Workers are often looking to make more money, take more responsibility, or build some kind of name or reputation for themselves. There's absolutely nothing wrong with that, but that goal also incentivizes to work harder and longer.

There's no one size fits all description for workers, everyone's different. The same is true for the whole system though, it doesn't roll up to any one cause.

tehjoker

What you say is true, but the dominant effect in the system driving it towards more exertion than anyone would find desirable is the profit incentive of owners to drive their workers harder.

alexpotato

My dad has a great quote on computers and automation:

"In the 1970s when office computers started to come out we were told:

'Computers will save you SO much effort you won't know what to do with all of your free time'.

We just ended up doing more things per day thanks to computers."

perilunar

It’s Solow’s paradox: “You can see the computer age everywhere, except in productivity statistics.” — Nobel Prize-winning American economist Robert Solow, in 1987

joshdavham

Your dad sounds like a wise man!

CaptainFever

This is what the "AI will be a normal technology" camp has been telling the "AI is going to put us all out of work!" camp all along. It's always been like this.

TekMol

When it comes to programming, I would say AI has about doubled my productivity so far.

Yes, I spend time on writing prompts. Like "Never do this. Never do that. Always do this. Make sure to check that," to tell the AI my coding preferences. But those prompts are forever. And I wrote most of them months ago, so now I just capitalize on them.
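
As a minimal sketch of what "prompts are forever" can look like in practice: store the preferences once and prepend them to every request. The file name, model, and OpenAI-style call below are illustrative assumptions, not a description of the commenter's actual setup:

    # Sketch: reuse a saved preferences prompt as the system message on every request.
    # Assumes the `openai` Python package and an OPENAI_API_KEY in the environment;
    # the preferences file and model name are placeholders.
    from openai import OpenAI

    PREFERENCES = open("coding_preferences.txt").read()  # "Never do X. Always do Y. ..."
    client = OpenAI()

    def ask(task: str) -> str:
        resp = client.chat.completions.create(
            model="gpt-4o",  # placeholder model name
            messages=[
                {"role": "system", "content": PREFERENCES},
                {"role": "user", "content": task},
            ],
        )
        return resp.choices[0].message.content

    print(ask("Implement a function that parses ISO 8601 dates."))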

rcruzeiro

Would you be comfortable sharing a bit about the kind of work you do? I’m asking because I mostly write iOS code in Swift, and I feel like AI hasn’t been all that helpful in that area. It tends to confidently spit out incorrect code that, even when it compiles, usually produces bad results and doesn’t really solve the problem I’m trying to fix.

That said, when I had to write a Terraform project for a backend earlier this year, that’s when generative AI really shined for me.

rdn

For iOS/Swift the results reflect the quality of the information available to the LLM.

There is a lack of training data; Apple docs aren't great or really thorough, much documentation is buried in WWDC videos, and following Stack Overflow posts requires an understanding of how the APIs evolved over time to avoid confusion, which trips up newcomers as well as code generators. Stack Overflow is also littered with incorrect or outdated solutions to iOS/Swift coding questions.

anshumankmr

I can't comment on Swift, but I presume training data for it might be less available online. Whereas with Python, which is what I use, in my anecdotal experience it can produce quite decent code, and some sparks of brilliance here and there. But I use it for boilerplate code I find boring, not the core stuff. I would say as time progresses and these models get more data, it may help with Swift too (though this may take a while, because I remember a convo with another person online who said the Swift code GPT-3.5 produced was bad, referencing libraries that did not exist).

TekMol

Which LLMs have you used? Everything from o3-mini has been very useful to me. Currently I use o3 and gemini-2.5 pro.

I do full stack projects, mostly Python, HTML, CSS, Javascript.

I have two decades of experience. Not just my work time during these two decades but also much of my free time. As coding is not just my work but also my passion.

So seeing my productivity double over the course of a few months is quite something.

My feeling is that it will continue to double every few months from now on. In a few years we can probably tell the AI to code full projects from scratch, no matter how complex they are.

roywiggins

I think LLMs are just better at Python and JS than other languages, probably because that's what they're more extensively trained on.

codr7

As long as you're just rebuilding what already exists, yes.

genghisjahn

I’ve found it to be really helpful with golang.

With swift it was somewhat helpful but not nearly as much. Eventually stopped using it for swift.

natch

Sometimes it’s PEBCAK. You have to push back on bad code and it will do better. Also not specifying the model used is a red flag.

null

[deleted]

nico

> When it comes to programming, I would say AI has about doubled my productivity so far

For me it’s been up to 10-100x for some things, especially starting from scratch

Just yesterday, I did a big overhaul of some scrapers, that would have taken me at least a week to get done manually (maybe doing 2-4 hrs/day for 5 days ~ 15hrs). With the help of ChatGPT, I was done in less than 2 hours

So not only was it less work, it was also a way shorter delivery time

And a lot less stress

mountainriver

Agree! I love this aspect, coding feels so smooth and fun now

nico

Yes! It has definitely brought back a lot of joy in coding for me

croes

Is the code audit included in that 2 hours?

nico

This didn’t require PRs

But, it did require passing tests

Most of the changes in the end were relatively straightforward, but I hadn’t read the code in over a year.

The code also implemented some features I don’t use super regularly, so it would’ve taken me a long time to load everything up in my head, to fully understand it enough, to confidently make the necessary changes

Without AI, it would have also required a lot of Google searches to find documentation and instructions for setting up some related services that needed to be configured

And, it would have also taken a lot more communication with the people depending on these changes + having someone doing the work manually while the scrapers were down

So even though it might have been a reduction of 15hrs down to 1.5hrs for me, it saved many people a lot of time and stress

humanrebar

Excellent question. Maybe people will use this newfound productivity to actually review, test, and document code. Maybe.

Bjorkbat

I'm always a little bit skeptical whenever people say that AI has resulted in anything more than a personal 50% increase in productivity.

Like, just stop and think about it for a second. You're saying that AI has doubled your productivity. So, you're actually getting twice as much done as you were before? Can you back this up with metrics?

I can believe AI can make you waaaaaaay more productive in selective tasks, like writing test conditions, making quick disposable prototypes, etc, but as a whole saying you get twice as much done as you did before is a huge claim.

It seems more likely that people feel more productive than they did before, which is why you have this discrepancy between people saying they're 2x-10x more productive vs workplace studies where the productivity gain is around 25% on the high end.

TekMol

I'm surprised there are developers who seem to not get twice as much done with AI than they did without.

I see it happening right in front of my eyes. I tell the AI to implement a feature that would take me an hour or more to implement and after one or two tries with different prompts, I get a solution that is almost perfect. All I need to do is fine-tune some lines to my liking, as I am very picky when it comes to code. So the implementation time goes down from an hour to 10 minutes. That is something I see happening on a daily basis.

Have you actually tried? Spend some time to write good prompts, use state of the art models (o3 or gemini-2.5 pro) and let AI implement features for you?

shafyy

Even if what you are saying is true, a significant part of a developer's time is not writing code, but doing other things like thinking about how to best solve a problem, thinking about the architecture, communicating with coworkers, and so on.

So, even if AI helps you write code twice as fast, it does not mean that it makes you twice as productive in your job.

Then again, maybe you really have a shitty job at a ticket factory where you just write boilerplate code all day. In which case, I'm sorry!

zeroonetwothree

There are specific subsets of work at which it can sometimes be a huge boost. That’s a far cry from making me 2x more productive at my job overall.

Bjorkbat

I mean, I don't disagree with you when you say that something that would take an hour or more to implement would only take 10 minutes or so with AI. That kind of aligns with my personal experience. If something takes an hour, it's probably something that the LLM can do, and I probably should have the LLM do it unless I see some value in doing it myself for knowledge retention or whatever.

But working on features that can fit within a timebox of "an hour or more" takes up very little of my time.

That's what I mean, there are certain contexts where it makes sense to say "yeah, AI made me 2x-10x more productive", but taken as a whole just how productive have you become? Actually being 2x productive as a whole would have a profound impact.

ptx

> those prompts are forever

Have you tested them across different models? It seems to me that even if you manage to cajole one particular model into behaving a particular way, a different model would end up in a different state with the same input, so it might need a completely different prompt. So all the prompts would become useless whenever the vendor updates the model.

null

[deleted]

yubblegum

What is it like to maintain the code? How long has it been in production? How many iteration cycles (enhancements, refactoring, ...) have you seen with this type of code?

TekMol

It's not different from code I write myself.

I read each line of the commit diff and change it, if it is not how I would have done it myself.

risyachka

GG, you do twice the work, with twice the mental strain, for the same wage. And you spend time on writing prompts instead of mastering your skills, thus becoming less competitive as a professional (as anyone can use AI, that's a given level now). Sounds like a total win.

gabrieledarrigo

And...? Does it result in a double salary, perhaps?

IshKebab

Obviously not because AI is available to everyone and salary isn't only a function of work completed.

gabrieledarrigo

Exactly that!

giantg2

The real problem is with lower skilled positions. Either people in easier roles or more junior people. We will end up with a significant percent of the population who are unemployable because we lack positions commensurate with their skills.

hwillis

Lower skilled than office clerks and customer service representatives? Because they were in the study.

cadamsdotcom

2023-24 models couldn't be relied on for small, low-level tasks thanks to hallucinations and poor instruction following; newer models are much better, and that trend will keep going. That low-level reliability allows models to be a building block for bigger systems. Check out this personal assistant built by Nate Herk, a YouTuber who builds automations with n8n:

https://youtube.com/watch?v=ZP4fjVWKt2w

It’s early. There are new skills everyone is just getting the hang of. If the evolution of AI was mapped to the evolution of computing we would be in the era of “check out this room-sized bunch of vacuum tubes that can do one long division at a time”.

But it’s already exciting, so just imagine how good things will get with better models and everyone skilled in the art of work automation!

qoez

That's the story of all technology, and it's the argument that AI won't take jobs that pmarca etc. have been making for a while now. Our focus will be able to shift into ever narrower areas. Cinema was barely a thing 100 years ago. A hundred years from now we'll get some totally new industry thanks to freeing up labor.

Mbwagava

Cinema created jobs though, it didn't reduce them. Furthermore, the value of film is obvious. You have to hedge an LLM heavily to pitch it to anyone.

pimlottc

> Cinema created jobs though, it didn't reduce them.

Is it that straightforward? What about theater jobs? Vaudeville?

atonse

Tough to say how it maps but with cinema, you have so many different skill sets needed for every single film. Costumes, builders for sets, audio engineers, the crews, the caterers, location scouts, composers, etc.

In live theater it would be mostly actors, some one time set and costume work, and some recurring support staff.

But then again, there are probably more theaters and theater production by volume.

Mbwagava

Fair point, but that's hardly applicable to the LLM metaphor. If you're ok with shit work you can just run a program.

api

Also the nature of software is that the more software is written the more software needs to be written to manage, integrate, and make use of all the software that has been written.

AI automating software production could hugely increase demand for software.

The same thing happened as higher level languages replaced manual coding in assembly. It allowed vastly more software and more complex and interesting software to be built, which enlarged the industry.

bluefirebrand

> AI automating software production could hugely increase demand for software

Let's think this through

1: AI automates software production

2: Demand for software goes through the roof

3: AI has lowered the skill needed to make software, so many more can do it with a 'good-enough' degree of success

4: People are making software for cheap because the supply of 'good enough' AI prompters still dwarfs the rising demand for software

5: The value of being a skilled software engineer plummets

6: The rich get richer, the middle class shrinks even further, and the poor continue to get poorer

This isn't just some kind of wild speculation. Look at any industry over the history of mankind. Look at Textiles

People used to make a good living crafting clothing, because it was a skill that took time to learn and master. Automation makes it so anyone can do it. Nowadays, automation has made it so people who make clothes are really just operating machines. Throughout my life, clothes have always been made by the cheapest overseas labour that capital could find. Sometimes it has even turned out that companies were using literal slaves or child labour.

Meanwhile the rich who own the factories have gotten insanely wealthy, the middle class has shrunk substantially, and the poor have gotten poorer

Do people really not see that this will probably be the outcome of "AI automates literally everything"?

Yes, there will be "more work" for people. Yes, overall society will produce more software than ever

McDonalds also produces more hamburgers than ever. The company makes tons of money from that. The people making the burgers usually earn the least they can legally be paid

ImHereToVote

This assumes we won't achieve AGI. If we do, all bets are off. Perhaps neuromorphic hardware will get us there.

sph

Let's achieve AGI first before making predictions.

dsign

When AGI is achieved, it will be able to make the predictions. And everything else. And off we the humans go to the reserve.

Mbwagava

Is there any evidence that AGI is a meaningful concept? I don't want to call it "obviously" a fantasy, but it's difficult to paint the path towards AGI without also employing "fantasize".

ceejayoz

I mean, humans exist. We know a blob of fat is capable of thought.

bwfan123

There is no point in these ill-formed hypothetical untestable assumptions.

- Assuming god comes to earth tomorrow, earth will be heaven

- Assuming an asteroid strikes earth in the future we need settlements on mars

etc.; pointless discussion, gossip, and BS required for human bonding, like on this forum or in a Bierhaus

risyachka

If we do, it will be able to grow food, build houses, etc. using humanoids or other robots.

We won’t need jobs so we would be just fine.

layer8

How would you pay for those robots without a job? Or do you think whoever makes them will give them to you for free? Maybe the AI overlord will, but I doubt it.

littlestymaar

The agricultural revolution did in fact reduce the amount of work in society by a lot, though. That's why we can have weekends, vacations, retirement and study, instead of working non-stop from age 12 to death like we did 150 years earlier.

Reducing the amount of work done by humans is a good thing actually, though the institutional structures must change to help spread this reduction to society as a whole, instead of having mass unemployment + no retirement before 70 and 50-hour work weeks for those who work.

AI isn't a problem, unchecked capitalism can be one.

vunderba

That's not really why (at least in the U.S.) - it was due to strong labor laws; otherwise, post industrial revolution, you'd still have people working 12 hours a day, 7 days a week - though with minimum wage stagnation one could argue that many people have to do this anyway just to make ends meet.

https://firmspace.com/theproworker/from-strikes-to-labor-law...

littlestymaar

That's exactly what I'm saying! You need labor laws so that you can lower the amount of work across the board and not just on average.

But you can't have labor laws that cut the amount worked by half if you have no way to increase productivity.

UncleOxidant

The agricultural revolution has been very beneficial for feeding more people with less labor inputs, but I'm kind of skeptical of the claim that it led to weekends (and the 40hr workweek). Those seem to have come from the efforts of the labor movement on the manufacturing side of things (late 19th, early 20th century). Business interests would have continued to work people 12hrs a day 7 days a week (plus child labor) to maximize profits regardless of increasing agricultural efficiency.

littlestymaar

Please re-read my comment, it says exactly the same thing as you are.

bilsbie

Medical is our fastest growing employer and you could make the case that modern agriculture produced most of that demand:

Obesity, mineral depletion, pesticides, etc.

So in a way automation did make more work.

nialv7

Work will expand to fill the time available.

(I know this is not the commonly accepted meaning of Parkinson's law.)

analog31

This reminds me of a thought I had about driver-less trucks. The truck drivers who get laid off will be re-employed as security guards to protect the automated trucks from getting robbed.

crazygringo

That's an amusing idea, but won't happen. Trucks will just be made more secure, that much harder to open, if theft starts to increase.

layer8

Only if that’s cheaper than security guards. “Just” hiring security guards may be more cost-effective than “just” making trucks more robbery-resistant.

crazygringo

Of course it's going to be cheaper.

If a truck has a lifetime of 20 years, that's 20 years' worth of paying a security guard for it.

You really think it could take 20 years' worth of human effort in labor and materials to make a truck more secure? The price of the truck itself in the first place doesn't even come close to that.
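
A back-of-envelope comparison makes the point; the figures below are illustrative assumptions, not sourced numbers:

    # Rough comparison of "pay a guard for the truck's lifetime" vs. "harden the truck".
    # All dollar figures are assumptions for illustration only.
    guard_cost_per_year = 50_000        # assumed fully-loaded cost of one guard, USD/yr
    truck_lifetime_years = 20
    lifetime_guard_cost = guard_cost_per_year * truck_lifetime_years  # = 1,000,000

    hardening_per_truck = 20_000        # assumed one-time cost of reinforced locks/doors

    print(f"Guard over truck lifetime: ${lifetime_guard_cost:,}")
    print(f"One-time hardening:        ${hardening_per_truck:,}")

Even with generous error bars on these assumed figures, a one-time hardening cost sits far below two decades of guard wages, which is the comparison the comment is making.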