
AWS CEO says replacing junior devs with AI is 'one of the dumbest ideas'

alexgotoi

The thing people miss in these “replace juniors with AI” takes is that juniors were never mainly about cheap hands on keyboards. They’re the only people in the org who are still allowed to ask “dumb” questions without losing face, and those questions are often the only signal you get that your abstractions are nonsense.

What AI does is remove a bunch of the humiliating, boring parts of being junior: hunting for the right API by cargo-culting Stack Overflow, grinding through boilerplate, getting stuck for hours on a missing import. If a half-decent model can collapse that search space for them, you get to spend more of their ramp time on “here’s how our system actually fits together” instead of “here’s how for-loops work in our house style”.

If you take that setup and then decide “cool, now we don’t need juniors at all”, you’re basically saying you want a company with no memory and no farm system – just an ever-shrinking ring of seniors arguing about strategy while no one actually grows into them.

Always love to include a good AI x work thread in my https://hackernewsai.com/ newsletter.

simonw

Relevant post by Kent Beck from 12th Dec 2025: The Bet On Juniors Just Got Better https://tidyfirst.substack.com/p/the-bet-on-juniors-just-got...

> The juniors working this way compress their ramp dramatically. Tasks that used to take days take hours. Not because the AI does the work, but because the AI collapses the search space. Instead of spending three hours figuring out which API to use, they spend twenty minutes evaluating options the AI surfaced. The time freed this way isn’t invested in another unprofitable feature, though, it’s invested in learning. [...]

> If you’re an engineering manager thinking about hiring: The junior bet has gotten better. Not because juniors have changed, but because the genie, used well, accelerates learning.

beAbU

Isn't the struggling with docs and learning how and where to find the answers part of the learning process?

I would argue a machine that short circuits the process of getting stuck in obtuse documentation is actually harmful long term...

chaos_emergent

Isn't the struggle of sifting through a labyrinth of physical books and learning how and where to find the right answers part of the learning process?

I would argue a machine that short-circuits the process of getting stuck in obtuse books is actually harmful long term...

sfpotter

It may well be. Books have tons of useful expository material that you may not find in docs. A library has related books sitting in close proximity to one another. I don't know how many times I've gone to a library looking for one thing but ended up finding something much more interesting. Or to just go to the library with no end goal in mind...

GeoAtreides

When I first opened QBasic, <N> years ago, when I was a wee lad, the online QBasic help didn't replace my trusty qbasic book (it supplemented it, maybe), nor did it write the programs for me. It was just there, doing nothing, waiting for me to press F1.

AI, on the other hand...

ori_b

Well, yes -- this is why I still sit down and read the damn books. The machine is useful to refresh my memory.

jaapbadlands

Feel free to waste your time sifting through a dozen wrong answers. Meanwhile the rest of us can get the answers, absorb the right information quickly then move on to solving more problems.

Aurornis

I recall similar arguments being made against search engines: People who had built up a library of internal knowledge about where and how to find things didn't like that it had become so easy to search for resources.

The arguments were similar, too: What will you do if Google goes down? What if Google gives the wrong answer? What if you become dependent on Google? Yet I'm willing to bet that everyone reading this uses search engines as a tool to find what they need quickly on a daily basis.

CharlieDigital

I argue that there is a strong, strong benefit to reading the docs: you often pick up additional context and details that would be missing in a summary.

Microsoft docs are a really good example of this where just looking through the ToC on the left usually exposes me to some capability or feature of the tooling that 1) I was not previously aware of and 2) I was not explicitly searching for.

The point is that the path to a singular answer can often include discovery of unrelated insight along the way. When you only get the answer to what you are asking, you lose that process of organic discovery of the broader surface area of the tooling or platform you are operating in.

As a senior dev, I generally have a good idea of what to ask for because I have built many systems and learned many things along the way. A junior dev? They may not know what to ask for, and therefore may never discover those "detours" that would yield additional insights to tuck into the manifolds of their brains for future reference.

rafaelmn

No, trying stuff out is the valuable process. How I search for information changed (dramatically) in the last 20 years I've been programming. My intuition about how programs work is still relevant - you'll still see graybeards saying "there's a paper from 70s talking about that" for every "new" fad in programming, and they are usually right.

So if AI gets you iterating faster and testing your assumptions/hypothesis I would say that's a net win. If you're just begging it to solve the problem for you with different wording - then yeah you are reducing yourself to a shitty LLM proxy.

tencentshill

The naturally curious will remain naturally curious and be rewarded for it, everyone else will always take the shortest path offered to complete the task.

schainks

Disagree. While documentation is often out of date, the threshold for maintaining it properly has been lowered, so your team should be doing everything it can to surface effective docs to devs and AIs looking for them. This, in turn, also lowers the barrier to writing good docs since your team's exposure to good docs increases.

If you read great books all the time, you will find yourself more skilled at identifying good versus bad writing.

supersour

I think if this were true, then individualized mastery learning wouldn't prove to be so effective

https://en.wikipedia.org/wiki/Mastery_learning

jimbokun

Why?

If you can just get to the answer immediately, what’s the value of the struggle?

Research isn’t time spent coding, so it’s not making the developer less familiar with the code base she’s responsible for, which is the usual worry with AI.

lokar

For an experienced engineer, working out the syntax, APIs, type issues, understanding errors, etc is the easy part of the job. Larger picture issues are the real task.

But for many Jr engineers it’s the hard part. They are not (yet) expected to be responsible for the larger issues.

bdangubic

what is a larger issue? lacking domain knowledge? or lacking the deeper understanding of years of shit in the codebase that seniors may have? where I work, there is no issue that is "too large" for a junior to take on, it is the only way that "junior" becomes "non-junior" - by doing, not by delegating to so-called seniors (I am one of them)

dclowd9901

"Larger issue" is overall technical direction and architecture, making decisions that don't paint you into a corner, establishing maintainability as a practice, designing work around an organization's structure and habit and so on.

But these are the things people learn through experience and exposure, and I still think AI can help by at least condensing the numerous books out there around technology leadership into some useful summaries.

ekkeke

You can't give a junior tasks that require experience and nuance that have been acquired over years of development. If you babysit them, then perhaps, but then what is the point? By its nature, "nuance" is something hard to describe concretely, but as someone who has mentored a fair few juniors, most of them don't have it. AI generally doesn't have it either. Juniors need tasks at the boundary of their capability, but not far beyond it, to be able to progress. Simply allowing them to make a mess of a difficult project is not a good way to get there.

There is such a thing as software engineering skill and it is not domain knowledge, nor knowledge of a specific codebase. It is good taste, an abstract ability to create/identify good solutions to a difficult problem.

Aperocky

Unnecessary complexity, completely arbitrary one-off designs, over-emphasis on one part of the behavior while ignoring others. Using design patterns where they shouldn't be used, coding once and forgetting that operations exist, using languages and frameworks that are familiar but unfit for the purpose. The list goes on and I see it happen all the time; AI only makes it worse because it tends to validate all of these with "You're absolutely correct!".

Good luck maintaining that.

almosthere

We had 3 interns this past summer - with AI I would say they were VERY capable of generating results quickly. Some of the code and assumptions were not great, but it did help us push out some releases quickly to alleviate customer issues. So there is a tradeoff with juniors. May help quickly get features out, may also need some refactoring later.

turnsout

Interesting how similar this is to the tradeoff of using AI coding agents

SkyPuncher

*Some juniors have gotten better.

I hate to be so negative, but one of the biggest problems junior engineers face is that they don't know how to make sense of or prioritize the glut of new-to-them information to make decisions. It's not helpful to have an AI reduce the search space because they still can't narrow down the last step effectively (or possibly independently).

There are junior engineers who seem to inherently have this skill. They might still be poor in finding all necessary information, but when they do, they can make the final, critical decision. Now, with AI, they've largely eliminated the search problem so they can focus more on the decision making.

The problem is it's extremely hard to identify who is what type. It's also something that senior level devs have generally figured out.

lanfeust6

Search is easily the best feature of AI/LLMs.

alpha_squared

I kind of agree here. The mental model that works for me is "search results passed through a rock tumbler". Search results without attribution and mixed-and-matched across reputable and non-reputable sources, with a bias toward whatever source type is more common.


sublinear

That's arguably all it ever was. Generating content using AI is just finding a point in latent space.

GeoAtreides

>but because the genie, used well, accelerates learning.

This is "the kids will use the AI to learn and understand" level of cope

no, the kids will copy and paste the solution then go back to their preferred dopamine dispenser

CuriouslyC

I've learned a lot of shit while getting AI to give me the answers, because I wanted to understand why it did what it did. It saves me a lot of time trying to fix things that would have never worked, so I can just spend time analyzing success.

There might be value in learning from failure, but my guess is that there's more value in learning from success, and if the LLM doesn't need me to succeed my time is better spent pushing into territory where it fails so I can add real value.

skydhash

Do you honestly think that’s how people learn?

This is an example of a book on Common Lisp

https://gigamonkeys.com/book/practical-a-simple-database

What you usually do is follow the book's instructions and get some result, then go do some exploration on your own. There's no walking in the dark trying to figure out your own path.

Once you learn what works, and what does not, you'll have a solid foundation to tackle more complex subjects. That's the benefit of having a good book and/or a good teacher to guide you on the path to mastery. Using a slot machine is more tortuous than that.

irishcoffee

The amount of copium in the replies to this is just amazing. It’s amazing.

ivape

Don’t confuse this with this person’s ability to hide their instincts. He is redefining “senior” roles as junior, but words are meaningless in a world of numbers. The $$$ translation is that something that was worth $2 should now be worth $1.

Because that makes the most business sense.

israrkhan

1. Replacing junior engineers with AI of course breaks the talent pipeline. Seniors will retire one day; who is going to replace them? Are we taking the bet that we won't need any engineers at that time? Sounds dangerous.

2. Junior engineers' heavy reliance on AI tools is a problem in itself. AI tools learn from existing code that was written by senior engineers. Too much use of AI by junior engineers will result in a deterioration of engineering skills. It will eventually result in AI learning from AI-generated code. This is true for most other content as well, as more and more content on the internet is AI-generated.

orliesaurus

Interesting take... I'm seeing a pattern... People think AI can do it all... BUT I see juniors often are the ones who actually understand AI tools better than seniors... That's what AWS CEO points out... He said juniors are usually the most experienced with AI tools, so cutting them makes no sense... He also mentioned they are usually the least expensive, so there's little cost saving... AND he warned that without a talent pipeline you break the future of your org... As someone who mentors juniors, I've seen them use AI to accelerate their learning... They ask the right questions, iterate quickly, and share what they find with the rest of us... Seniors rely on old workflows and sometimes struggle to adopt new tools... HOWEVER the AI isn't writing your culture or understanding your product context... You still need people who grow into that... So I'm not worried about AI replacing juniors... I'm more worried about companies killing their own future talent pipeline... Let the genies help, but don't throw away your apprentices.

codegeek

"BUT I see juniors often are the ones who actually understand AI tools better than seniors"

Sorry, what does that mean exactly ? Are you claiming that a junior dev knows how to ask the right prompts better than a Senior dev ?

__s

Their implication is that junior devs are more likely to have built their workflow around AI tooling, probably because, being younger, they had more plasticity in their process to adopt it.

Overall I don't quite agree. Personally this applies to me: I've been using vim for the last decade, so any AI tooling that wants me to run some Electron app is a non-starter. But many of my senior peers coming from VS Code have no such barriers.

citrin_ru

Speaking of vim - adding and configuring the Copilot plugin for vim is easy (it runs a Node.js app in the background, but if you have a spare 500 MB of RAM it's invisible).

bongodongobob

If AI is just prompts to you, you fall into the "don't know how to use it" group.

perfmode

Some old dogs resist learning new tricks.

lvl155

7/10 senior devs (usually fellas 50+) will get mad at you for trying to use Claude Code. Me: “dude it writes better code than crap you write in your mush middle-age brain.” Also me: “I also have mush brain.”

I think LLMs are a reflection of human intelligence. If we humans become dumber as a result of LLMs, LLMs will also become dumber. I’d like to think that in some dystopian world, LLMs trained on pre-2023 data will be sought after.

orliesaurus

ON TOP OF IT ALL, juniors are the ones who bring novel tools to the desk MOST times...i.e. I had no clue the Google IDE gave you free unlimited credits despite the terrible UI...but a young engineer told me about it!!

bluGill

I've seen seniors and juniors bring novel tools in. Seniors do it less often perhaps - but only because we have seen this before under a different name and realize it isn't novel. (sometimes the time has finally come, sometimes it fail again for the same reason it failed last time)

JKCalhoun

I've seen seniors bring up novel ideas to juniors—well, novel to the juniors anyway.

Just an example. I've been in so many code bases over the years… I had a newer engineer come aboard who, when he saw some code I recently wrote with labels (!), kind of blanched. He thought "goto == BAD". (We're talking C here.)

But this was code that dealt with Apple's CoreFoundation. More or less every call to CF can fail (which means returning NULL in the CF API). And (relevant) passing NULL to a CF call, like when trying to append a CF object to a CF array, was a hard crash. CF does no param checking. (Why, that would slow it down—you, dear reader, are to do all the sanity checking.)

So you might have code similar to:

    CFMutableDictionaryRef dict = NULL;

    dict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0,
                                     &kCFTypeDictionaryKeyCallBacks,
                                     &kCFTypeDictionaryValueCallBacks);
    if (!dict)
        goto bail;

You would likely continue to create arrays, etc—insert them into your dictionary, maybe return the dictionary at the end. And again, you checked for NULL at every call to CF, goto bail if needed.

Down past 'bail' you could CFRelease() all the non-null instances that you do not return. This was how we collected our own garbage. :-)

In any event, goto labels made the code cleaner: your NULL-checking if-statements did not have to nest crazy deep.

The new engineer admitted surprise that there might be a place for labels. (Or, you know, CF could have been more NULL-tolerant and simply exited gracefully.)

salawat

I'm just shocked people aren't clueing into the fact that tech companies are trying to build developer dependence on these things to secure a "rent" revenue stream. But hey, what do I know. It's just cloud hyper scaling all over again. Don't buy and drive your own hardware. Rent ours! Look, we built the metering and everything!

SketchySeaBeast

I'd hope people are. It's painfully obvious this entire AI push is rent-seeking half hidden by big tech money. At some point the free money is going to go away, but the rent for every service will remain.

JKCalhoun

> I'm seeing a pattern...

Me too. Fire your senior devs. (Ha ha, not ha ha.)

Ancalagon

No no, fire them.

Cannot wait for the 'Oh dear god everything is on fire, where is the senior dev?' return pay packages.

whazor

Amazon has an internal platform for building software. The workflows are documented and have checks and balances. So the CEO wants more junior developers who are proficient with AI, and proportionally fewer senior developers. Also, product context comes from product managers and UX designers.

For medium or small companies, these guardrails or documentation can be missing. In that case you need experienced people to help out.

WestCoader

Sorry but what the heck is up with all the ellipses in this comment?

raincole

They have an emacs package that triples their . automatically!

Mountain_Skies

It's a sort of stream of consciousness. That style of writing goes in and out of style from time to time but some people use it consistently.

debo_

They're trying really hard to make sure you know they didn't write their post with an LLM? /s

red-iron-pine

honestly i think that'll be a thing in the future

"bespoke, hand generated content straight to your best readers"

ch2026

is this just a janky summary cause you added zero new viewpoints

yieldcrv

you're right but my opinion about this has changed

I would have agreed with you 100% one year ago. Basically, senior engineers were too complacent to look at AI tools, as well as ego-driven about them, all while corporate policy disincentivized them from using anything at all, with maybe a forced Co-Pilot subscription. Junior engineers, meanwhile, would take the risk that the corporate monitoring of cloud AI tools wasn't that robust.

But now, although many of those organizations are still the same - with more contrived Co-Pilot subscriptions - I think senior engineers are skirting corporate policy too and becoming more familiar with the tools.

I'm also currently in an organization that is a total free for all with as many AI coding and usage tools as necessary to deliver faster. So I could be out of touch already.

Perhaps more complacent firms are the same as they were a year ago.

pnathan

I - senior - can patch an application in an unknown language and framework with the AI. I know enough to tell it to stop the wildly stupid ideas.

But I don't learn. That's not what I'm trying to do - I'm trying to fix the bug. Hmm.

I'm pretty sure AI is going to lead us to a deskilling crash.

Food for thought.

pphysch

On the contrary, being able to access (largely/verifiably) correct solutions to tangible & relevant problems is an extremely great way to learn by example.

It should probably be supplemented with some good old RTFM, but it does get us somewhat beyond the "blind leading the blind" paradigm of most software engineering.

omnimus

I think the temptation to use AI is so strong that those who keep learning will be the valuable ones in the future. Maybe by asking the AI to explain/teach instead of asking for the solution directly. Or by not using AI at all.

JeremyNT

I think seniors know enough to tell whether they need to learn or not. At least that's what I tell myself!

The thing with juniors is: those who are interested in how stuff works now have tools to help them learn in ways we never did.

And then it's the same as before: some hires will care and improve, others won't. I'm sure that many juniors will be happy to just churn out slop, but the stars will be motivated on their own to build deeper understanding.

frostiness

I can't help but feel this is backpedaling after the AI hype led to people entering university avoiding computer science or those already in changing their major. Ultimately we might end up with a shortage of developers again, which would be amusing.

mjr00

I went to university 2005-2008 and I was advised by many people at the time to not go into computer science. The reasoning was that outsourced software developers in low-cost regions like India and SEA would destroy salaries, and software developers should not expect to make more than $50k/year due to the competition.

Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.

This might be the professional/career version of "buy when there's blood in the streets."

avgDev

I went for CS in my late 20s, always tinkered with computers but didn't get into programming earlier. College advisor told me the same thing, and that he went for CS and it was worthless. This was 2012.

I had a job lined up before graduating. Now make high salary for the area, work remotely 98% of the time and have flexible schedule. I'm so glad I didn't listen to that guy.

dylan604

The one thing I learned in college is that the advisors are worthless. There's how many students? And you are supposed to expect they know the best thing for you? My advisor told me that all incoming freshmen must take a specific math class, a pre-calculus course, totally ignoring all of my AP exams that showed I was well beyond that. Wasted my time and money.

codegeek

My take is that these are not binary issues. With outsourcing, it is true that you can hire someone cheaper in Asian countries but it cannot kill all jobs locally. So what happens is that the absolute average/mediocre get replaced by outsourcing and now with AI while the top talent can still command a good salary because they are worth it.

So I think that a lot of juniors WILL get replaced by AI not because they are junior necessarily but because a lot of them won't be able to add great value compared to a default AI and companies care about getting the best value from their workers. A junior who understands this and does more than the bare minimum will stand out while the rest will get replaced.

hrimfaxi

> Even more recently we had this with radiologists, a profession that was supposed to be crushed by deep learning and neural networks. A quick Google search says an average radiologist in the US currently makes between $340,000 to $500,000 per year.

At the end of the day, radiologists are still doctors.

sublinear

Yup hearing big talk about competition and doom is a strong signal that there is plenty of demand.

You can either bet on the new unproven thing claiming to change things overnight, or just do the existing thing that's working right now. Even if the new thing succeeds, an overnight success is even more unrealistic. The insight you gain in the meantime is valuable for you to take advantage of what that change brings. You win either way.

bluGill

When there is no competition that is a sign there is no demand.

There can sometimes be too much competition, but often there is only the illusion of too much if you don't look at quality. You can find a lot of cheap engineers in India, but if you want a good quality product you will have to pay a lot more.

ravenstine

Can anyone really blame the students? If I were in their shoes, I probably wouldn't bother studying CS right now. From their perspective, it doesn't really matter whether AI is bullshit in any capacity; it matters whether businesses who are buying the AI hype are going to hire you or not.

Hell, I should probably be studying how to be a carpenter given the level at which companies are pushing vibe coding on their engineers.

simonw

"after the AI hype led to people entering university avoiding computer science or those already in changing their major"

That's such a terrible trend.

Reminds me of my peers back in ~2001 who opted not to take a computer science degree even though they loved programming because they thought all the software engineering jobs would be outsourced to countries like India and there wouldn't be any career opportunities for them. A very expensive mistake!

roncesvalles

Certainly, I even know of experienced devs switching out of tech entirely. I think the next couple of decades are going to be very good for software engineers. There will be an explosion of demand yet a contraction in supply. We are in 2010 again.

DiscourseFan

There will be programmers of the old ways, but AI is basically code 2.0, there are now a lot of things that are AI specific that those with traditional software development skills can’t do.

omnimus

Like what exactly?

fullshark

Or maybe they realize the AI needs humans in the loop for the foreseeable future for enterprise use cases and juniors (and people from LCL areas) are cheaper and make the economics make some sort of sense.

Nextgrid

It's backpedaling but I don't think it's planning ahead to prevent a developer shortage - rather it's pandering to the market's increasing skepticism around AI and that ultimately the promised moonshot of AI obsoleting all knowledge work didn't actually arrive (at least not in the near future).

It's similar to all those people who were hyping up blockchain/crypto/NFTs/web3 as the future; now that that hype has blown over, they've adapted to the next grift (currently it's AI). He is now toning down his messaging in preparation for a cooldown of the AI hype, to appear rational and relevant to whatever comes next.

seg_lol

"We were against this all along"

mattgreenrocks

The party line will be: “we always advised using it as long as it helps productivity.”

Pointing out that it wasn’t always that will make you seem “negative.”


burningChrome

Agreed.

Considering the talk on HN lately that there are way too many junior devs, it would indeed be amusing.

raincole

> changing their major

To what?

ok123456

So he's saying we should be replacing the seniors with fresh grads who are good at using AI tools? Not a surprising take, given Amazon's turnover rate.

epolanski

My experience is that juniors have an easier time ramping up, but never get better at proper engineering (analysis) and development processes (debugging). They also struggle to read and review code.

I fear that unless you heavily invest in them and follow up with them, they may be condemned to decades of junior experience.

tayo42

> but never get better at proper engineering (analysis) and development processes (debug). They also struggle to read and review code.

You can describe pre-AI developers like that too. It's probably my biggest complaint about some of my co-workers.

PartiallyTyped

I have the same experience.

In my view there are two parts to learning, creation and taste, and both need to be balanced to make progress. Creation is, in essence, the process of forming pathways that enable you to do things; developing taste is the process of pruning and refining pathways to do things better.

You can't become a chef without cooking, and you can't become a great one without cultivating a taste (pun intended) for what works and what it means for something to be good.

From interactions with our interns and new grads: they lack the taste, and rely too much on the AI for generation. The consequence is that when you have conversations with them, they struggle to understand the concepts and tools they are using, because they lack the familiarity that comes with creation, and they lack the skills to refine the produced code into something good.

fire2dev

> A company that relies solely on AI to handle tasks without training new talent could find itself short of people.

I kind of agree with this point from the perspective of civilisation.

itissid

I gave Opus an "incorrect" research task (using this slash command [1]) in my REST server: research whether SQLite + the Litestream VFS can be used to create read replicas for the REST service itself. This is obviously a dangerous use of the VFS [2], and of a system like SQLite in general (stale reads and isolation-wise speaking). Of course it happily went ahead and used Django's DB router feature to implement `allow_relation` to return true if `obj._state.db` was the `replica` or the `default` master db.

Now Claude had access to this link [2] and it pulled the data into the research prompt using a web searcher. But that's not the point. Any junior worth their salt — distributed systems 101 — would know _what_ the obvious failure was: a failure to pay attention to the _right_ thing. While there are ideas on prompt optimization out there [3][4], the issue is how many tokens it can burn thinking about these things, and coming up with an optimal prompt and corrections to it is a very hard problem to solve.

[1] https://github.com/humanlayer/humanlayer/blob/main/.claude/c... [2] https://litestream.io/guides/vfs/#when-to-use-the-vfs [3] https://docs.boundaryml.com/guide/baml-advanced/prompt-optim... [4] https://github.com/gepa-ai/gepa
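For concreteness, the Django database-router pattern described in the comment above can be sketched like this. This is a minimal illustration, not the actual generated code: the class name and the `default`/`replica` aliases are assumptions, and a real router would be wired up via Django's `DATABASE_ROUTERS` setting.

```python
# Sketch of a Django-style DB router that sends reads to a replica.
# Routers are plain classes, so this runs without Django installed.

class ReplicaRouter:
    """Route reads to the replica, writes to the primary."""

    ALIASES = {"default", "replica"}

    def db_for_read(self, model, **hints):
        return "replica"

    def db_for_write(self, model, **hints):
        return "default"

    def allow_relation(self, obj1, obj2, **hints):
        # The questionable part flagged above: declaring "default" and
        # "replica" to be the same logical database hides the replication
        # lag between them from the ORM.
        if obj1._state.db in self.ALIASES and obj2._state.db in self.ALIASES:
            return True
        return None  # no opinion; let other routers decide
```

The danger isn't in the router mechanics, it's in the premise: a lagging read replica served through a VFS is not the same logical database as the primary, so relations that span the two can silently mix stale and fresh rows.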

NewJazz

I'm not sure a junior would immediately understand the risks of what you described. Even if they did well in dist sys 101 last year.

klipklop

I believe the idea is not to stop hiring juniors. Instead it's to replace anybody who commands a high salary with a team of cheaper juniors armed with LLMs. The idea is more about dragging down average pay than never hiring anybody. At least for now.

stockresearcher

And then all those unemployed seniors with extensive domain knowledge use AI to speedrun the creation of competition and you need to spend $$$$ to buy them out and shut them down. Solid idea.

alecco

Meanwhile:

"Amazon announces $35 billion investment in India by 2030 to advance AI innovation, create jobs" https://www.aboutamazon.com/news/company-news/amazon-35-bill... (Dec 9 2025)

la64710

This is for data locality.