AI isn't going to kill the software industry
105 comments
January 24, 2025
simonw
wruza
I think this effect will be even greater this time (last time being higher-level “slow” languages like python and js), because AI will allow for a new wave of developers who won’t care about the “right” code that much and will perceive it as a disposable resource rather than a form of art. This aligns well with many smaller businesses that are by nature temporary or very dynamic and have to actually fight with developers’ tendencies to create beautiful solutions after the ship has sailed.
paulryanrogers
IME the software quality (including my own) is already on the edge of the unmaintainable slop threshold. I don't think it can slip much further without taking their hosts down too. Even Apple's software quality seems to be hit or miss lately.
gsf_emergency_2
As it becomes easier and more profitable to accrue tech debt, more tech debt will be accrued..
See also: Jevons paradox
>[Tech companies], both historical and modern, typically expect that higher [revenues] will lower [tech debt], rather than expecting the Jevons paradox
afavour
I feel like a lot of those problems are already addressed by no-code products. CRMs, project management tools, web site builders… those with relatively straightforward needs who aren’t that fussy about how it gets done are already served. I don’t doubt AI will help some people here but I’m not convinced it’ll be by an industry changing amount.
spamizbad
It depends on the definition of “right”
As software becomes more essential to a business its reliability becomes more important. If your customers can tolerate defects or downtime it’s a signal that:
A) you’re not providing any real value
B) You provide so much additional value compared to your competition you still come out ahead in the wash
C) Your customers are hostages via vendor lock-in
A and C are the most common cases of persistent bad software.
jamil7
I just witnessed an example of more or less this at a startup I’ve been contracting at. The engineering team on the core product is tiny with no slack for extra projects. One non-technical founder and a few other non-technical people build a prototype piece of software using AI and low code tools for facilitating another revenue stream. They started using it with a few customers and raised more money around it. The money they raised is going directly into expanding the engineering team to work on both products.
etrautmann
This is essentially Jevons paradox [1] applied to software development. As it becomes easier and more efficient to create software, more of it will be consumed/demanded/created.
[1] https://en.wikipedia.org/wiki/Jevons_paradox
edit: whoops that's the point of the original article
gsf_emergency_2
I'd go further than TFA and say the same goes for tech debt:
As it becomes easier and more profitable to accrue tech debt, more tech debt will be accrued
jameslk
> More software means more jobs for increasingly efficient software developers
This assumes software developers will be the ones needed to meet the demand. I think it will be more like technical product managers
simonw
I think it will be the software developers that lean more into the kind of skills you see in technical product managers, or maybe vice-versa.
swatcoder
You're splitting hairs over a title, but effectively talking about the same individuals.
Whatever juice there is to squeeze out of generative AI coding, the people who will squeeze the most from it in the near future (10-20 years) are current and incoming software developers. Some may adopt different titles as the role matures, especially if there's a pay differential. This already happened with data engineering, database administration, etc
So it's possible that the absolute number of people carrying the title approximately like "software developer" may be fewer in 10 years than it is today, although I personally find that very unlikely. But the people leading and maturing whatever alternate path comes up, for the next generation or so, will more often than not have a professional or academic background in software development regardless.
The whole point of the article is that generative AI represents a lever that amplifies the capabilities of people who best know how to apply it. And for software development, that's ultimately going to be software developers.
jameslk
> You're splitting hairs over a title, but effectively talking about the same individuals.
I disagree. SWE skills are not the same as TPM skills. Those who like doing SWE work may not like TPM work nor be good at it. And my point is TPM skills will be what’s likely more needed. Therefore they may not be the same individuals, though many will need to adapt. Or go into woodworking or something
nicoburns
> I think it will be more like technical product managers
Even with traditional software development you'll be a much more effective software developer with good product management skills in a lot of niches (basically anything that's building something user-facing)
QuiDortDine
But who will they blame when the solution inevitably fails to live up to the inherently contradictory demands of the different stakeholders?
deepGem
Essentially software is going to become like email. Dirt cheap to produce. The folks who will get employed and continue to make money like crazy will be the email automation company equivalents. Perhaps a bad analogy, but the jobs are not gonna go anywhere.
Engineers will still continue to work like crazy and produce 100x the output. The pay could still remain the same because the profit margins on this newly developed software are gonna be so much better.
That being said, I think there will be a cycle of adjustment - maybe 2-5 years for this reality to set in. So in that interim there may be job losses.
paulryanrogers
Doesn't this assume there will always be significant business needs that can be met by software, but are not yet? Or at least not as efficiently as they could be?
I imagine there is an upper bound on how much of the world can be eaten by software, and the trend seems to be getting closer to that point. Unless there are some massive breakthroughs in robotics or cybernetics which open more of the physical world to software.
There's also a point where incremental software improvement requires bulldozing, paving, or burning so much nature that we'll be worse off in the end. Watching billionaires squabble about who gets the new AI data centers makes me wonder if we haven't crossed over that point a few funding rounds ago.
throwup238
I don't think we’re anywhere near the upper bound yet and IMO the prevalence of SaaS software that could be better done in house if resources permitted demonstrates that. The future will be a lot more bespoke.
Like Simon’s quote above says, software is a competitive advantage so when one company develops software that makes them more efficient or grows revenue, competitors have to follow suit or get left behind. It’s an economic arms race. That’s why the dreaded outsourcing wave of the 2000s never materialized: companies ended up hiring a bunch more software engineers in the US and outsourced a bunch of other engineering to India and other countries.
The interesting question is how this will interact with current interest rates and the end of ZIRP.
paulryanrogers
Software is being commoditized by SaaS. Generally that means it's table stakes, not necessarily a competitive advantage.
natemwilson
I actually strongly believe the universe has an effectively infinite carrying capacity for software. This is because all systems can be improved upon recursively
andsoitis
> and the trend seems to be getting closer to that point.
What evidence suggests this to you?
paulryanrogers
Things that seem borderline worse when being done in software. Like touchscreens in car controls, beta self driving, shitty/broken websites that should be paper flyers on bulletin boards, a tablet at the barbers where a sign in sheet or paper numbers would do better, those "smart" doors some grocery stores tried, etc.
paradite
The counterargument is that if businesses could always find more projects that bring in revenue, they would not be laying people off or shutting down.
The reality is businesses can't really just scale their revenue by adding more profitable or positive-ROI projects. There are not enough of them to go around. Eventually you hit a plateau in terms of growth, and R&D / engineering productivity can't translate into revenue anymore.
jaredklewis
Like the grand parent, I think at a macro scale there will be more projects and more software engineering jobs over time as productivity increases.
But that’s across the whole economy. Individual businesses will very much behave idiosyncratically. Businesses tend to have an inverted-U shape in terms of their growth; unlike the economy as a whole, they don’t go up and to the right forever. And the economy as a whole, if you zoom out, does go up and to the right, but it is bumpy.
I guess in summary, I don’t think it makes sense to extrapolate long term trends by focusing on just the last 2 years, when the market for software engineers has cooled. After all, these cooler years followed what I can only feel were 3 or 4 years of a scalding hot market for software engineers.
captainkrtek
Very well put. I don’t think the efficiency improvement is the piece to debate; it’s pretty clear (like having autocomplete for typing out text messages). I have yet to feel threatened (job-security-wise) by LLMs.
65
It's beneficial for executives to say AI will kill the software industry. It's a way to stoke fear in workers, and a convenient way to say "with AI you could be x more productive," which sets the expectation that the worker should be that many times more productive than they already are, with or without AI "help". This is an attempt to increase hours worked at the same wages, which is itself an attempt to lower wages.
andsoitis
While this behavior may exist in pockets, my experience suggests that it is not broadly the mental model / approach of executives. I think this is a narrative in some people's heads, but it is neither an accurate reflection of the world nor a healthy way to approach employment.
65
You're probably right, though I speak from anecdotal experience, as the CTO of my company recently said that developers should be 5X more productive with AI. So what if I'm not 5X more productive with AI? Doesn't matter, because there's more tickets in the sprint that, in his head, I should theoretically be spending the same time working on as "before AI."
lexandstuff
Sounds like your CTO is dangerously incompetent and isn't actively using the AI tooling they are endorsing.
simonw
While I found myself in furious agreement with the section titled "Jevons Paradox", I'm less convinced by this argument from the "Comparative Advantage" section:
"While AI is powerful, it’s also computationally expensive. Unless someone decides to rewrite the laws of physics, there will always be a limit on how much artificial intelligence humanity can bring to bear."
The cost for running a prompt through the best-available model has collapsed over the past couple of years. GPT-4o is about 100x less expensive than GPT-3 was, and massively more capable.
DeepSeek v3 and R1 are priced at a fraction of OpenAI's current prices for the GPT-4o and o1 models and appear to be highly competitive with them.
I don't think we've hit the end of that trend yet - my intuition is that there's a lot more performance gains still to be had.
I don't think LLM systems will be competitive with everything that humans can do for a long time, if ever. But for the things they CAN do the cost is rapidly dropping to almost nothing.
thfuran
A human brain runs on about 20 W, and I see no reason to believe that that is the absolute limit of efficiency. It's probably quite efficient as far as mammalian meat goes, but evolution can only optimize somewhat locally.
mitthrowaway2
Indeed, the laws of physics don't put any limits on artificial intelligence that don't also apply to natural intelligence. It's a strange place to look for a comparative advantage argument.
TheDudeMan
> I don't think LLM systems will be competitive with everything that humans can do for a long time
It's looking like about 3 years. That will be another ~100x cost reduction, $100s more billions in infra, new training algorithms, new capabilities.
carbocation
I remember when Hinton said that we should "stop training radiologists now" in 2016[1]. Meanwhile, radiologists are in high demand and are getting paid better than ever. I believe the same will be true for programmers in the future. Sure, some of the boilerplate will be handled for you, just like segmentation is for radiologists. That's great for everyone.
selimnairb
The notion of no longer training radiologists because computer vision algorithms and deep learning are good at detecting cancers in imagery strikes me as troublingly naïve. Who fucking trained the AIs? Will the AIs magically be able to detect yet-to-be-discovered maladies?
dboreham
Fwiw this is not new. I worked on a machine vision project to evaluate x-rays for cancer...in 1985.
cameldrv
A radiologist is training for a 35-40 year career though. I definitely would not have wanted to have started that training in 2016.
TheDudeMan
Did the radiologists suddenly get better than the AI at reading images? Or is the system simply unchangeable?
carbocation
A few responses:
- Since when have the radiologists ever been worse than AI at reading images?
- AI is doing mechanical tasks for the radiologists, so it stands to reason that this makes radiologists more efficient.
- The radiologist is a liability sponge, so if at all possible it of course makes sense to augment the radiologist with AI rather than to try to do away with them. (This roughly gets at your point about the system being unchangeable.)
nialv7
I am quite tired of seeing titles like this. No, you _don't know_. The vast majority of definitive statements like this are going to be meaningless. The whole point is that it's an uncertainty; the impact of AI on our society is unpredictable. You could be right, but you could be wrong too. And merely assigning a probability to this is going to be very non-trivial.
I just can't understand where people find the kind of confidence to say AI is (or is not) going to <insert your scenario here>.
agentultra
I haven’t seen a study yet that suggests AI tools enhance productivity of developers.
I wouldn’t take it as a given.
The study I have seen was from a company selling AI developer tools whose researchers were employed by said company. Not exactly an independent and bias-free study.
Personally they don’t work for me.
It’s not AI that is going to take our jobs. It’s the capital class that will do that. That’s what is changing the industry.
I suspect this will be the year we start to hear stories of folks getting let go for not using LLM codegen.
dboreham
I feel anecdotally that Stack Overflow significantly increased my productivity. By induction therefore so should an LLM.
ClimaxGravely
Interesting I found SO to not be that helpful even during the golden years.
I should note that I tend to work in codebases that have little to no public information, though, so that might be the differentiator. Presumably an LLM would be less useful due to that, but I'm looking forward to trying again in the future.
parpfish
i agree that the industry wont be killed, but I do have some worries about what the future will look like.
- If we keep making AI-assistance tools that make mid- and senior-level ICs more and more efficient, where does that leave entry-level junior positions? It's already tough enough for juniors to get a foot in the door, but will it get even harder as we continue to make the established older devs more and more efficient?
- The current crop of AI-assistance tools are being tailored to meet the needs of mid- and senior-level ICs that learned programming in a pre-AI world. But incoming junior devs are "AI native" and may approach software development in a very different way.
- I would wager that there will be substantial workplace/generational divides between devs that learned programming first and adopted AI assistance later vs "AI native" devs that had AI assistance the whole time. I have no idea what these new ways of working will be, but I'm curious to see how it plays out.
rahimnathwani
AI is going to make building software way cheaper and more profitable, but that's actually bad news for a lot of developers out there. Think about how many people are only employed because they know the basics of React or Django or whatever. They can copy-paste code and tweak existing patterns, but that's exactly what AI is getting really good at.
The developers who are actually going to thrive are the ones who can architect complex systems and solve gnarly technical problems. That stuff is getting more valuable, not less.
But a lot of folks have built careers on pretty basic skills. They've gotten by because there just aren't many humans who can do even simple technical work. That advantage is disappearing fast.
65
The barrier to entry will be raised, though isn't that always happening in software, regardless of whether it's AI or not? Companies used to hire HTML email developers, for example. There are many HTML email builders out there that do the job for a marketing person.
Better tooling, if it's AI tooling or a framework, continuously changes the job requirements. Even your average React developer still has to deal with plenty of other things people in the past didn't have to think about. E.g. dependency management, responsive screen sizes for all screen widths, native apps, state management, etc.
rahimnathwani
"Better tooling, if it's AI tooling or a framework, continuously changes the job requirements."
This might be true if you work at a tech company, but it's not universally true. There are many people who are gainfully employed as software developers, based solely on technical knowledge they acquired years ago.
1shooner
"Cheaper software means people are going to want more of it. More software means more jobs for increasingly efficient software developers. Economists call this Jevons Paradox."
If we accept there will be increased demand for software, it's a big jump from that to concluding the efficiency of AI will be outpaced by the demand for software, specifically along the dimension of required developers.
Software isn't wheat or fuel, it can be reused and resold.
apeace
The job of "software engineer" as we know it will end.
Before the industrial revolution, shoemakers would make shoes. It was a specialized skill, meaning shoes were very expensive, so most people couldn't afford them.
Then factories were invented. Now shoes could be made cheaply and quickly, by machines, so more people could afford them. This meant that far more people could be employed in the shoe industry.
But those people were no longer shoemakers. Shoemakers were wiped out overnight.
Think of how huge the shoe industry is now. There are jobs ranging from factory worker to marketing manager. But there are zero shoemakers.
AI writing software doesn't mean it's the end of the industry. Humanity will benefit greatly, just like we did from getting cheaper shoes.
But the software engineers are screwed.
apeace
To add to this: the author is missing a major aspect of the Jevons paradox.
They keep referencing "more efficient software developers," but the Jevons paradox isn't only about efficiency. The efficiency creates lower cost, which in turn increases demand.
The main cost of software is software engineers. It's a specialized skill, so it's a high-salary job.
With AI doing most of the work, salaries will begin to fall. It will no longer make sense to study computer science, or spend years learning to code, for such a low salary. There will no longer be people doing what we call software engineering today.
So the author is right, Jevons paradox will take effect. But like I said above, it will replace the current industry with a very different-looking industry.
bamboozled
> But those people were no longer shoemakers. Shoemakers were wiped out overnight.
Have you seen the cost and popularity of "Made in X" handmade boots though? Red Wing, Origin, Red Back. It's absolutely crazy
The difference is, all of a sudden we could make a lot of CHEAP shoes, and yes, I'm sure it wiped out a lot of shoemaker jobs, but there are still a lot of good shoemakers around and there is still high demand for handmade shoes and boots.
Kerrick
“Crack the books”—can anybody recommend good books for this shared future of ours? I’m tired of trying to piece it together from blog posts, READMEs, and short video tutorials.
simonw
I haven't read these all the way through myself but I've seen enough of them that I'm confident suggesting them:
- Prompt Engineering for LLMs by John Berryman and Albert Ziegler: https://www.amazon.com/Prompt-Engineering-LLMs-Model-Based-A...
- AI Engineering by Chip Huyen, which I recommend based on the strength of this extract about "agents": https://huyenchip.com/2025/01/07/agents.html
airstrike
I like hands-on learning more than I like textbooks, so in case that matches your requirements, maybe try training your own GPT to have a sense for how it works. I wrote a Rust version of the famous https://github.com/karpathy/nanoGPT (which is in Python) so that I could learn how it's built.
I wrote it in Rust because I wanted to improve my skills in that language, be forced to write code instead of just reading the existing implementation so that I would truly learn, and test the quality of the nascent Rust AI/ML ecosystem, but you could pick your own language
brainbag
Would you say more about your experience writing it in Rust? What worked well, what didn't, anywhere you found that you struggled unexpectedly or that was easier than you expected?
FloorEgg
Thank you for this article, I really needed a positivity boost today.
It's a great reminder to not only consider what wonderful things could happen in the future, but also to want them and work towards them. Clarifying a positive future like this helps others consider it, want it, and make it happen.
I understand the case being made is more of a prediction than a wish, but it's also a vision. I believe clarifying a vision makes it more likely to happen, especially when people gravitate to it.
There is plenty of negativity already. Thanks again for the positive outlook.
I've been trying to put this effect into words for a while, now I don't have to - this is really clearly stated:
"AI tools create a significant productivity boost for developers. Different folks report different gains, but most people who try AI code generation recognize its ability to increase velocity. Many people think that means we’re going to need fewer developers, and our industry is going to slowly circle the drain.
This view is based on a misunderstanding of why people pay for software. A business creates software because they think that it will give them some sort of economic advantage. The investment needs to pay for itself with interest. There are many software projects that would help a business, but businesses aren’t going to do them because the return on investment doesn’t make sense.
When software development becomes more efficient, the ROI of any given software project increases, which unlocks more projects. [...] Cheaper software means people are going to want more of it. More software means more jobs for increasingly efficient software developers."
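The ROI unlock described in that excerpt is easy to sketch as a toy model (all numbers here are hypothetical): each candidate project has an expected payoff, and a business only builds the ones whose payoff exceeds the development cost. Cutting the cost lets previously unviable projects clear the bar:

```python
# Toy Jevons-style model: cheaper development unlocks projects
# whose ROI was previously negative. All numbers are made up.

payoffs = [5, 10, 20, 40, 80, 160]  # expected payoff of candidate projects ($k)

def viable(payoffs, dev_cost):
    """Return the projects worth building: payoff must exceed cost."""
    return [p for p in payoffs if p > dev_cost]

before = viable(payoffs, dev_cost=50)  # traditional development cost
after = viable(payoffs, dev_cost=25)   # AI-assisted, at half the cost

print(before)  # [80, 160]
print(after)   # [40, 80, 160] -- more viable projects, hence more work
```

Halving the cost here doesn't shrink the amount of development work; it grows the set of projects worth doing, which is the Jevons-style effect the article argues for.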