Career Advice in 2025
184 comments
March 15, 2025 · Zee2
r_singh
It’s important to understand how AI will affect your field and recalibrate your position or contribution accordingly
It is a big enough change for this to be a valid question for anyone in the world today.
Leaving what you're doing and going into "AI" will likely set you up for a crypto-level disaster.
Vibe coding is a thing, but vibe business-building or job hunting isn't! So beware of hype, and know that in the end money is made by serving people; it will be equally hard with vibe coding too, because the bar is higher.
AI will create new opportunities for sure, but "follow the opportunity, not the AI" is the sentiment here, I guess.
everdrive
>It’s important to understand how AI will affect your field and recalibrate your position or contribution accordingly
In the case of my industry (middling cybersecurity) we're seeing the following "advances"
- When you ask someone a question, they vomit your question into co-pilot, paste the result, and presume that they have helped somehow.
- All meetings now have not-useful meeting notes and no one reads these.
- People are considering implementing security co-pilot, which will introduce useful advances such as spending much more time building promptbooks so co-pilot can understand our logs.
- A lot more people think they're engineers, and vomit out scripts which do things the "authors" do not anticipate.
asa400
>- When you ask someone a question, they vomit your question into co-pilot, paste the result, and presume that they have helped somehow.
We have been dealing with this at my job also. It's really concerning how this is becoming normalized and how often we've had to deal with it. Somehow there are people that have "Engineer" in their title that think this is acceptable workplace behavior and work product for a professional making $XXX,XXX/year.
We had a person join our team recently who doesn't know our stack at all (which is fine, we were happy to teach them). When another engineer reviewed their pull request and asked a question, they pasted the question into Copilot and responded to the pull request with the answer (which was wrong!), even going so far as to say "Copilot thinks it's this: ...". I almost lost it. Your job is to learn, understand, and apply that knowledge, not paste incorrect model responses back and forth between web forms!
It's baffling and enraging. Are people _trying_ to demonstrate to management and their teammates that they're actually worthless? Are our expectations as a profession really this low, that we don't expect people to understand the code that they push?
Aeolun
> The author doesn't seem to interrogate this assumption.
Neither do my senior leaders, so it might as well be true.
asa400
Our senior leaders have also been completely captured by this crap. Recently our CTO (public company in the US you've heard of) announced in chat that engineers with an aversion to relying on LLMs have an attitude problem that is incompatible with our company direction. I was blown away.
AnimalMuppet
Only in the short term. In the medium to long term, false assumptions will kill a company. As an employee, you would be better off recognizing it before the crunch hits.
Aeolun
Nah, we have enough captured business that I doubt it’d make a difference. It’s also not actively terrible for the customer, it just doesn’t bring anything to the table for the use cases we’ve used it for.
Then again, maybe it’s good to give people some experience with it even if there’s no real reason to use it right at this moment.
__loam
Think about what you're asking when you tell these people to interrogate the assumption that LLM-based AIs are going to be the dominant technology going forward. Hundreds of billions of dollars, the growth of the technology industry, the entire US stock market, and the global economy have been wagered on this technology. Imagine the turmoil when those in power realize the reality of what they're betting the farm on.
bayarearefugee
The next time I am angrily typing to claude 3.7 in all caps because he overengineered a bunch of code I didn't even ask him to write in the first place, I'll be sure to let him know his continued failures are risking the entire world economy.
Workaccount2
I think SWEs have a serious blind spot here. I use the (rough) analogy of bowling to help illustrate this.
People need to knock over pins in the bowling lane. SWEs are the pro bowlers who can (usually) throw pretty clean shots to knock over pins. Now bumpers have been invented (LLMs), and regular folks who only have the faintest idea of how to roll a ball are knocking over pins. To the pros, these bumpers are all manner of bad and useless. To the laymen, they are an absolute revolution.
I can tell you, with a straight face, that my (non-tech) company has already forgone hiring a pro bowler in at least four instances now because of these bumpers. Just last week we skipped a $1k/mo CAD SaaS because Claude was able to build the needed narrow-scope tooling in 10 minutes.
I'm sure a pro could come in and make that Python program 3x as fast and use 60% less memory. But the fact of the matter is that we paid Anthropic $20 and spent 10 minutes to get a working niche manufacturing file interpreter/editor/converter.
LLMs are finally bridging the language barrier between computers and humans. The tech already exists to make this even more widespread; it's just a matter of time before someone creates a tech-illiterate IDE where users can paste AI-generated code in and functioning programs come out the other side. No need to ever even see a terminal or command line. I wouldn't be surprised if this is already in the works.
"Hey Google, create an app that allows me to take a picture of a house plant, and then allows me to verbally make entries into a diary about that plant" Sure thing! Give me 3 minutes and the app will be on your homescreen and shareable .apk in your documents folder! I'll also cancel the $9.99/mo app that does the same thing for you. (ok probably not this part but you get the idea.)
__loam
You can be as snarky as you want but the reality is we're years deep into a market cycle that has seen a tremendous amount of capex with very little visible return.
How much more productive do you think Claude makes you compared to Google or Stack Overflow? 15%? 50%? 200%? Do you think that's enough to satisfy the market, or are we all trading on unrealistic expectations? Do you think shareholders are going to like it that they're losing billions a quarter so Anthropic can run a service that helps you write web-dev projects marginally faster? Do you even understand the amount of value that's tied up in these questions having a good answer right now?
coffeefirst
Yeah... unfortunately that might be where we are.
I have no analogy for this except the railroads of the Gilded Age. Did railroads become a pretty big deal? Yeah. They were also a giant vortex that slurped up endless investment, far more than the real demand could possibly justify. And it ends, well, we know how it ends.
schnable
Fiber optic and data center build out in the nineties is similar. Overinvestment led to a bust for a period, but the infrastructure was useful and provided the foundation for the next wave of Internet growth. LLMs could be similar.
roncesvalles
We've become so accustomed to the rip-roaring growth that came from widespread Internet adoption and now that it has piped down, we're desperate to find the next big boom. VR, crypto, blockchain, generative AI. And each time, like degenerate gamblers, we're feeling it, this must be it, the next Big 'Un, the bet that redeems all the bets that went wrong, bigger, riskier, bigger, riskier.
But it just won't be, nothing in our lifetime will ever come close to what the Internet boom was. The window for becoming a Jeff Bezos or Mark Zuckerberg as easily as they did is closed now, and you just need to live with it. The title of this chapter till the end of our days will remain "After the Internet Boom" and it will chronicle this pathetic desperation.
schnable
I agree with the sentiment, but LLMs have already had a lot more adoption than VR and Crypto.
makeitdouble
> Hundreds of billions
Yes
> the growth of the technology industry
That's an overblown claim. AI companies failing won't mean technology stops advancing, nor that companies betting against (or independently of) AI would fail too.
> the entire US stock market
Probably yes
> the global economy
Probably no
shmerl
It wouldn't be the first bubble. That doesn't make the above point about questioning it incorrect.
intelVISA
Don't be silly, if the entire US stock market really was wagered on this technology then you'd better start learning Mandarin.
npodbielski
Exactly my thoughts reading this article. Luckily, if in the next few years we have thousands of projects written using 'AI', there will be a need for someone to debug and fix all of that broken software.
Or maybe not; maybe it will be cheaper to just slap on another ten k8s pods to mitigate the poor performance...
n_ary
I believe we passed the point of "bad software written, bad software deployed, business as usual" long ago, when AWS/GCP/Azure became an important requirement in job descriptions.
A bad piece of software can be decently hidden by burning more money in cloud bills, which gives leadership the inflated sense that their products are doing global-scale groundbreaking work.
With AI, I would not be surprised if the quality actually improves and the cost comes down (or stays the same). Of course, more bad software will be written by the many aspiring entrepreneurs realizing their dream idea of a Spotify clone, then sacrificing their life savings on complex cloud bills, with the ever-so-profitable rise in revenue of all cloud services cited as a benefit of AI while doing some more layoffs to jack up the stock prices.
The real revelations will come (they always do; nature and the economy work in cycles) when the damage caused by excessive layoffs comes due, and everyone will scramble to rehire people in a few years. Unlike the Ford innovation of replacing horse carts, software is prevalent in every aspect of our lives, same as doctors and lawyers and the civil service; hence we need to honestly play the game until the wave turns, and then cash in by making a 200x killing just like the businesses are cashing in right now.
KronisLV
> I believe we passed the point of "bad software written, bad software deployed, business as usual" long ago, when AWS/GCP/Azure became an important requirement in job descriptions.
> A bad piece of software can be decently hidden by burning more money in cloud bills, which gives leadership the inflated sense that their products are doing global-scale groundbreaking work.
Doesn't this apply to almost all software out there nowadays?
- Bloated enterprise frameworks (lots of reflection and dynamic class loading on the back end, wasteful memory usage; large bundles and very complicated SPAs on the front end)
- Sub-optimal DB querying and bad architectures
- Inefficient desktop and mobile apps built on web technologies because of faster iteration speed
- OS package management messes where it's not easy to halt updates and packages don't integrate well with the rest of the system (e.g. snap packages)
- Operating systems with ads in the start menu or multiple conflicting UI styles within it (Windows)
- Game engines that are hard to use well, to the point where people scoff just hearing UE5, and so on
Essentially just Wirth's law, taken to the maximum of companies and individuals optimizing for shipping quickly and things that catch attention, instead of having good engineering underneath it all: https://en.wikipedia.org/wiki/Wirth%27s_law
Not the end of the world, but definitely a lot of churn and I don't see things improving anytime soon. If anything, I fear that our craft will be cheapened a lot due to prevalence of LLMs and possible over-saturation of the field. I do use them as any other tool when it makes sense to do so... but so does everyone else.
npodbielski
> With AI, I would not be surprised if the quality actually improves and the cost comes down (or stays the same). Of course, more bad software will be written by the many aspiring entrepreneurs realizing their dream idea of a Spotify clone, then sacrificing their life savings on complex cloud bills, with the ever-so-profitable rise in revenue of all cloud services cited as a benefit of AI while doing some more layoffs to jack up the stock prices.
At this point we are all speculating, really. But from a logical point of view, LLMs are trained on code written by humans. As more and more code is written by LLMs instead, models will be trained on content written by other models. It will be very hard to distinguish which code on GitHub was written by a human and which by some model (unless the quality differs substantially). If that is the case, I would say the quality of the code they write will drop. Or the quality of the models will drop. Or model-written code will keep using pre-LLM patterns, because model-written code will not be part of the training data. It may be that LLM-written code will work but be hardly comprehensible to a human. For now, models do not have the negative feedback loop that humans have ('oh, the code does not compile', or 'the code compiles but throws an exception', or 'the code compiles and works but performs poorly').
Anyway, I am sure there will be an impact on the whole industry, but I doubt models will be the primary source of source code. A helpful tool for sure, but not a drop-in replacement for developers.
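The feedback-loop worry can be put as a toy recursion. This is my own illustration with made-up numbers, not anything measured: treat each model generation's quality as a weighted mix of fresh human-written data and the previous generation's slightly degraded output.

```python
# Toy sketch (illustration only; the shares and "fidelity" are invented).
# Each generation trains on a mix of human code and the previous
# generation's output, which preserves quality imperfectly.

def next_quality(prev_q, human_share=0.4, human_q=1.0, fidelity=0.9):
    """Average training-data quality seen by the next model generation.

    human_share: fraction of the corpus still written by humans
    fidelity:    how well model output preserves the quality it trained on
    """
    model_share = 1.0 - human_share
    return human_share * human_q + model_share * fidelity * prev_q

q = 1.0  # generation 0: trained purely on human code
history = [q]
for _ in range(10):
    q = next_quality(q)
    history.append(q)

# Quality drops each generation, but as long as human data keeps flowing in,
# it settles at a fixed point q* = human_share / (1 - model_share * fidelity)
# (about 0.87 here) instead of collapsing. As human_share -> 0, the fixed
# point goes to 0: the collapse scenario.
```

Under these assumptions, the decline stabilizes only because human-written data keeps entering the corpus; the open question in the comment above is exactly whether that share shrinks toward zero.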
zeckalpha
He's describing the current market, not speaking about how the market should be.
SwtCyber
I do think some transitions are inevitable and not because AI must be used, but because once enough companies figure out where it genuinely improves efficiency, the competitive pressure to follow suit becomes real
pydry
It likely won't be from where you expect, though.
It isn't always a bad idea to wait until trend lines are clear.
LLMs will not go away, but it's still not at all clear what skills investment should be made in response.
I've learned a ton about now-obsolete tech in the past, and I've even learned a ton about LLMs in the last year that ended up becoming obsolete.
randomNumber7
The same holds true for software developers, IMO. If you can't figure out how to use LLMs to improve efficiency, you're likely soon to be a dinosaur of the past (unless you work on something __very__ specific where LLMs don't help much).
dingnuts
I can barely think of any real application where they would help. I have a weekend project that's already too much context -- I asked Claude to change some Tailwind styles for me and it just shat up the whole file. and that was a toy!
even if it were allowed, how is Claude going to help me at work, where a single file in a large project, one of many, is tens of thousands of lines long?
guess I'm a dinosaur
apwell23
This guy probably doesn't do any actual work, just reads ppl like Dario Amodei who are out there claiming 90% of coding will be done by LLMs in the next 3-6 months.
"thought leaders" are a plague on working ppl.
simonw
Will wrote one of my all-time favorite essays about software engineering: https://lethain.com/migrations/
He's worth paying attention to IMO.
stickfigure
I just read this for the first time and was totally underwhelmed. What is the takeaway? "Derisk, Enable, Finish"? This is not insightful or even interesting.
Apocryphon
Is he going to update it with how AI will assist in migrations? Or get an LLM to ghostwrite it?
dalyons
This was excellent and rings very true, thank you
greymalik
An excerpt of the author’s bio:
> I’m a software engineering leader and writer, currently serving as Carta’s CTO. I’ve worked at Calm, Stripe, Uber, Digg, a few other places, and cofounded a defunct iOS gaming startup
acheron
Is this supposed to be an endorsement or an indictment?
apwell23
oh look, another out-of-touch 'leader'. So sick of these ppl.
His career advice should be how to get your work done while appeasing "leaders" like him at work.
My company now has a mandate that 'all coding work must be done by AI', and only manually if it's not possible. They bought licenses to all the AI coding tools.
Which would've been great if these things actually worked. I've never felt more like a stupid cog in my whole career than now.
AIFounder
[dead]
tibbar
Some more free advice, in no particular order:
* Try to get at least one job offer every year, even if you don't accept it.
* Look at the requirements for your dream job and figure out what you need to learn to qualify.
* Pick one skill and get very good at it. Spend an hour a day on it for a year.
* Steer away from skills like web development that are clearly getting eaten by LLMs.
* Look for work in major U.S. tech hubs like the Bay Area. Pay is better and network effects are strong, so your next job will be easier to get.
n_ary
> Steer away from skills like web development that are clearly getting eaten by LLMs
On the contrary, here in my corner of the EU, nearly 60% of new jobs are frontend or full stack.
Anything else left is mostly SAP consultant or DevOps.
I think the whole "WebDev is a dead end" line is just lies to dissuade new entrants. Literally the most successful businesses with a digital solution are web stuff with some automation that would otherwise be MS Excel sheets shared via email.
Also, this whole panic over LLMs is overblown. I know some brilliant, experienced people in other professions, like Electronics, Mechatronics, Aerospace, and Material Science, and literally all of them are finding the job market "very difficult at the moment". It is the bad global mood in general, used deceptively by opportunists to spread false fears about their LLM/AI.
At the end of the day, an insurance seller has hundreds of concerning reasons to convince you why everything is dangerous around you and you really need their product. Now apply that to AI sellers.
itake
Web development feels similar to PHP or WordPress development in 2010.
There are millions of small businesses that demand WordPress websites, but the barrier to entry to support and build those systems got very low very quickly. Professional developers were competing with high school students and offshore devs for work.
As a backend Developer, I can now build websites easily with Claude and react. I think web development, especially front end, will be like knowing HTML and CSS in 2015. Like everyone should know it, and thus not even worth putting on your resume.
only-one1701
Genuine question: why do you think LLMs will be able to handle frontend development in a way they won’t be able to handle backend development? I assume you’re talking about real companies, not toy projects or websites for restaurants or whatever.
tibbar
There's another factor here I forgot to mention - web development, as a specialization, tends to be paid less and has a lower career ceiling in many companies than backend and infra engineering. This is a personal observation on my part but I've seen many other people remark on this. True full stack engineering, I think, is reasonably safe from the robots at the moment.
If someone likes building products, I'd basically recommend that they not go 100% full-bore on frontend engineering, definitely go for "full-stack", and accept that a lot of frontend code is trivia that you can just ask the LLM for these days. I would also recommend that they develop solid product management and UX skills.
dakiol
Anecdotally, I work on backend/cloud, and my senior frontend engineers don't earn as much as I do, but close… and their environments are always less stressful. Major outages are not usually caused by frontends, and when they are, reverting to the last stable commit is enough since frontends are stateless. Their toolset is narrower than mine: they need to jump between frontend frameworks, but that's fine… I meanwhile need to jump between backend frameworks, DBs, k8s, distributed-systems knowledge, Unix tooling, OS/TCP/IP, etc.
ativzzz
> a lot of frontend code is trivia that you can just ask the LLM for these days
If you're building CRUD yes. If you're doing anything remotely complex or novel, LLMs fall apart, much like they do for complex backend tasks. There's a lot of cool, highly customized stuff you can build with JS, and LLMs don't do a very good job at general problem solving.
They're still helpful for writing small, specific functions, or providing high level guidance
eastbound
Certainly. Front-end and UX skills go together. Front-end dev as a beginner → UX → PM later would provide the same salary as back-end dev → DevOps / K8s.
sangeeth96
> Steer away from skills like web development that are clearly getting eaten by LLMs.
You had it going well there but then had to ruin it with a take like that.
If you’re talking about your trivial, github-filled example scenarios of frontend, sure. But then we could say the same for all other roles, including backend logic that’s regurgitated all the time.
Like with everything else, the non-trivial bits need work and skills.
wiseowise
Except that they're right: your "non-trivial bits" will be like 5%, and the rest will be dealing with idiots copy-pasting AI slop.
sangeeth96
I'd personally try to stay away from anything built this way, if I can get a whiff of it (and it's not super hard to spot the slop right now). But even if I agree on this for the share of throwaway apps or internal tools that might not need sophistication or care, my point was that the same take applies to the backend side as well. I'd argue that's the more at-risk domain, since data is usually in a serializable format ready to be fed into LLMs and doesn't have the challenges of visual input.
breput
> Look for work in major U.S. tech hubs like the Bay Area. Pay is better and network effects are strong, so your next job will be easier to get.
Jobs and the network effects happen all across the country. As you get older and maybe don't want the grind, or have a family, or just want a better work/life balance, this will become apparent.
Basically, always have at least two people who will support you for your next job.
tibbar
There are plenty of tech hubs besides the Bay Area, that's for sure. But I can tell you that when I moved from a small company in a small economy to a moderately-well known startup in the Bay, the rate at which recruiters contacted me jumped from maybe a few times a year to multiple times per week. And after a few years, many of my coworkers started their own companies and invited me to join them.
By contrast, I have very talented friends who did not make the jump to work at a tech hub, and they don't have the same kind of network or opportunities.
With that said, I very much agree with you about wanting work life balance, making sure there are people who will support you in your next job, etc. However, I think that this is much easier to optimize for when you do have an established career and an extensive network already.
breput
I didn't mean to nullify your experience.
You're undeniably right about how there have been spheres of influence where people and capital come together. Every area of the United States has some kind of forced "Silicon *" name for a reason.
I just don't think this will be true in the future. Move where you want to; you may have to work harder to break through, but that was also true in the Valley.
roland35
Good advice in general. I would add - try to pick a skill which gives you a deep understanding in something fundamental which will always be relevant, rather than a particular shiny tech.
I would love the Bay Area, but unfortunately it is extremely inaccessible for me and others once you have a family. Finding a place to live in a good school district seems to take a minimum of $2M for a house. Renting is less secure long-term when trying to maintain consistency for kids. And that's not even getting into earthquakes and wildfires!
titanomachy
If you're at the usual child-raising age, then you probably have 6-10 years of experience, and there should be lots of jobs that pay $500k in the Bay Area. Buying a $2M house on that salary is pretty doable.
roland35
$500k would likely be a staff-level salary offer. I think a senior role around $350-400k is probably more likely as a new hire coming in, which is definitely great money, but still hard to take on a $20k+ monthly mortgage with! Especially since a lot of that salary is variable equity income, which can go down (ask me how I know!!)
scarface_74
Amazingly enough, I and millions of other developers have managed to find jobs outside of the Bay Area.
Personally, I've found jobs quite easily, 10 times since 1996; the last two in 2023 and last year.
> Look at the requirements for your dream job and figure out what you need to learn to qualify.
Those jobs in the Bay Area mostly require you to “grind leetCode” and system design. They really don’t require you to know the latest frameworks, databases, Kubernetes, etc
tibbar
Hey, as I said in another thread, I did not start out working in the Bay, and ended up here somewhat by accident. It shocked me how much easier it was to find good jobs here, and I'll stand by that. With that said, of course I say this with no judgment to anyone not in the Bay or other tech hubs, it's friendly advice from personal experience.
> Those jobs in the Bay Area mostly require you to “grind leetCode” and system design. They really don’t require you to know the latest frameworks, databases, Kubernetes, etc
Hmm, it sounds like you have a negative opinion of Bay Area jobs in general. I'm asking people to first figure out what work sounds interesting to them, and then learn the relevant skills. If you have the skills, the Bay Area probably has the right job, too. Of course these jobs also exist elsewhere, I'm not sure why I'm triggering this reaction...
scarface_74
I’ve worked for BigTech. If you look at how to pass any of the interview processes, it’s basically all about generic coding interviews.
That’s it, those are the only “skills” you have to have to get into the BigTech - pass coding and system design interviews.
And there are thousands of developers looking for jobs - even those who are coming out of well known tech companies.
yolovoe
> They really don’t require you to know the latest frameworks, databases, Kubernetes, etc
The latest "frameworks, databases" are constantly changing. Being good at leetcode and system design is a better signal (of course, not perfect) than knowing specific tools.
Being good at system design implies you are aware of tradeoffs across various systems, and that, coupled with a willingness to grind, means you can pick up new tools and probably deliver on projects. I have used 13 languages and an equally absurd number of tools across 4 orgs in my 5 YOE at FAANG. It's constant learning, or you're out, basically. It doesn't make sense to quiz on anything specific. The interview process is quite fair, actually.
margalabargala
System design yes, leetcode no.
Leetcode is only a useful problem to ask if the candidate has not encountered that problem before and has not practiced leetcode. Otherwise it is exactly as good a signal as knowing some arbitrary framework or database.
DeathArrow
>Steer away from skills like web development that are clearly getting eaten by LLMs.
What do you mean by web development?
Would backend qualify? Would microservices qualify?
A complex application is much more than coding.
tibbar
I'm mostly thinking of frontend dev work, but also some types of light backend work that you might see in a CRUD app. And listen, I've done that work, I've done a lot of it, and I saw that it's mostly a career dead-end, becoming more and more automated/copiloted away. It's not a career moat to be a mid-level React developer. By contrast, some things I think are worth pivoting into include infrastructure, databases, data engineering, stats, etc, and I've spent the last few years pivoting into those areas.
An interesting counter-point is that if you have great product and design skills, this is a great time to learn frontend development, because it's more accessible than ever and can supercharge your existing skills. But the days of being a pure frontend coder are probably fading.
weatherlite
> By contrast, some things I think are worth pivoting into include infrastructure, databases, data engineering, stats, etc
I can't imagine why, if LLMs can magically solve web development (which can be complex as hell depending on the app), they wouldn't be able to solve infrastructure, databases, or data engineering. I somewhat agree that our career moats were hurt (though not as much as you seem to believe, in my opinion), but that's happening across the board.
fifilura
In my view, the core of front-end work was always the "User Interaction" part of the UI.
Yes some can be solved by designers, but I believe there will still be a big market for designer-programmers.
The programmers that understand design, interaction, pixels and colors will still be of great value.
But if you don't really care about how stuff looks, or can't tell the difference between an animation at 25fps vs 50fps, it is a good sign it's time to try something else.
AI will simply refine your skillset. A backend programmer will have more time to think about architecture, a data engineer/scientist will have more time to think about maths. Or in essence "what am I trying to achieve". And it is up to you to step up to it.
I rather think of this generation with hordes of "coders, writing code" as an anomaly.
csomar
Front-End interfaces can be as complex (or more complex) than Back-End/infra/analytics/etc... At the end of the day, it is all about data and the front-end needs to maintain state. If your interface is complex, your state will also be complex.
kristiandupont
>infrastructure, databases, data engineering, stats
Why would those areas be less exposed to the "LLM threat"?
krishnakanna18
> infrastructure, databases, data engineering
I work as a backend dev without an opportunity to work on these at my current company. How do I learn and showcase these skills effectively? Thanks!
nextts
A good signal is you get pissed off with LLMs because they hinder your job. Even if you tried in earnest to use them.
fragmede
that sounds more like someone needs anger management classes than a problem with technology or LLMs
luxuryballs
This, and why not use LLMs to become an even more productive web developer?
noisy_boy
I see the evolution, at least until things get drastic, as interviewers allowing LLM use and observing how fast you can deliver the objective while dealing with hallucinations etc.
SwtCyber
On avoiding web dev: I get the concern, but I wouldn't write it off completely. LLMs are changing the game, but they're not replacing deep expertise in architecture, scalability, or understanding real-world business constraints
wutwutwat
I don't think they meant architecture, scalability, or understanding real-world business constraints
I think they meant literally web developers, or, aka, frontend folks doing html and css.
I've worked on backend systems that run web sites and apis, for over a decade, and I've never once referred to myself as a web dev. That title has always been frontend specific imo
dzonga
Good advice, but this one is an error or a blind spot:
> Steer away from skills like web development that are clearly getting eaten by LLMs.
Funny enough, all the hyped YC / Bay Area startups don't make as much as your typical CRUD webapp. As devs we tend to be attracted to tech, but what makes good tech doesn't mean it's a good business. That's why your typical Bay Area startup depends on VC funding and will likely spend 10 years without being cash-flow positive.
rakejake
"decision-makers can remain irrational longer than you can remain solvent" - Very correct. Whether or not AI actually comes for your job, the fact that enough people at the top think so is enough to cause trouble.
pjmorris
A non-technical friend was asking about the prospects of AI 'taking over' jobs. I told him that I'm less worried about 'Skynet' than I am about 'Slopnet', where bad takes on the applications of 'AI' just make life harder for all of us. That'll come more from decision-maker irrationality than from the tech itself.
from-nibly
This is the problem. Right now we're not in the "I need AI to work or get out" stage. We're in the "AI might completely upend reality" stage.
It's just people telling stories to find bigger fools. Like the ads claiming they sell an AI employee that never needs sleep and never talks back.
Those ads are the same thing as the signs shoved in the lawn near the McDonald's drive-through that look like they were drawn with Sharpie but are really mass-printed. "Real estate investor looking for pupil, trade my money" kind of stuff.
They are purposefully looking for suckers who will overlook the sketchiness. They don't want normal people applying; that reduces the pitch's effectiveness.
They want only desperate people who will fall for anything.
faizshah
Same is true for remote work. All the engineers know the return to work policies are dumb but all the decision makers have decided we’re all wrong.
jes5199
then why don’t they co-locate teams when they get RTO’d? I keep hearing about people who have to go sit in a mandatory hot desk but are still stuck on Zoom all day. Seems like the worst of both worlds
klodolph
It’s ordinary corporate dysfunction. The mandates come top-down. People in management don’t think too hard about exceptions. The people making decisions are far-removed from the consequences of their decisions.
from-nibly
RTOs generally have nothing to do with any of the things they say. They are just layoffs.
You can't argue with them about the effectiveness of remote work. They aren't trying to optimize work. They are trying to fire people.
Working from home doesn't fire people, being more productive and happy doesn't fire people. Your mental well being doesn't have any bearing on how many people they need to fire.
__loam
Based on their own gut.
SwtCyber
Exactly. It’s less about whether AI can replace certain jobs and more about the fact that companies are making decisions as if it will. That alone reshapes hiring, budgets, and job security
nextts
We are still doing scrum-like stuff after all. And they are dragging people back to the office. Decision makers have billions at their disposal to be inefficient with.
adamtaylor_13
I started my own business last year that has happened to go quite well. As I’ve watched the software industry over the last year, all I can think is… “damn, what lucky timing.”
leetrout
I keep looking for what you claim to be doing. I feel like reality continues to smack me in the face with “no one cares about quality”. I have tried big enterprises. I have tried 8 startups in 18 years. I watch the leaders / founders make the same mistakes over and over.
Anyway - your profile resonates with me. Would love to grab a virtual coffee if you are up for it.
stogot
Could you share the mistakes they are making repeatedly?
leetrout
In no particular order:
Quality seems completely lost as a goal in any shop. See Deming and Peopleware.
In very general terms: "founder mode" is more often than not toxic because it is taken out of context and used to scapegoat being an asshole. Much like the Steve Jobs worshipping of yesteryear.
Lack of financial oversight by board/investors including founders putting up company money for personal investments including moving money into personal accounts to try and inflate their credit ratings.
Non-VC startup: lack of preparation for a future where income is reduced for a period of time, and failure to keep some amount of business savings around
VC startup: lack of appropriate fund handling given the assumption that the money machine will always be there
Hurting all option holders by removing the ability to exercise early or ruining their exit with poor financial stewardship including down rounds or bridge rounds
Always "time to fix it later" vs taking the time to plan and execute correctly in the first place. This is a relative trade-off of time to market against risk and opportunity cost, but with the exception of large events like tradeshows, a few more weeks to get it right is better than hobbling along with literally millions of dollars of technical debt if success is found.
And re:tradeshows - demoware is the name of the game and sales needs to be onboard with eng that demoware is demoware and what to sell ahead of the product
Lack of clear execution plan as product market fit is found and everyone has been acting in "throw shit on the wall to see what sticks" mode
General lack of discipline to keep things orderly as work is done which is a failure to understand systems and second / third order effects. Speed is a lagging indicator not a leading one. Everyone wants more, faster but then fails to slow down to put process in place to facilitate moving faster.
Not growing teams appropriately. Lots of lack of empathy, poor hiring practices, abysmal firing practices. Lots of tolerance of "smart assholes" in small teams when leadership only cares about P/L and so the toxicity grows.
iancmceachern
I've been doubling down on mine as well rather than pursue a more traditional career path and I feel the same way.
cambaceres
Hi, can you please elaborate a bit?
axpy906
The blog confused me. The author opens with senior leaders but does not distinguish between ICs and managers. That is a very important difference, and the two are written about somewhat interchangeably at the start, but I don't think that's necessarily correct.
Managers and senior ICs are both facing unique challenges now. However, they are very different and don't have a lot in common.
For me, as a senior IC, it's having the right skills and staying afloat in this challenging environment. For the middle managers, from what I can see, it's not being redundant, as Mark Cuban recently pointed out.
advael
I get why someone would want to tell people this, but I think it fails to be advice, as it describes the current state of affairs without much actual guidance for how to do better in it
Which, to be fair, no one really seems to be able to answer that meaningfully
n_ary
The real answer is, leaders chase trends, so to keep them satisfied you need to also chase trends.
Some advice:
* learn prompt engineering to impress your boss and next employer
* adopt an AI IDE at work; of course, don't go for Cline or freemium ones, go for the max expensive tier of Cursor/Windsurf/v0 etc.
* Take some expensive (the more the better) courses and workshops on agentic topics, and build some small projects. Justify your expenditure by citing being better prepared for the AI transition and adapting to the new paradigm to strategically beat competing businesses, which will go extinct without employees with such training
* build some proof-of-concept projects converting smaller, trivial projects to an agentic workflow. Then show these to your boss and put them up on GitHub for future reference
* learn to train smaller models on your own internal documents and build a chat interface on top, then give access to your boss (trust me, they will be blown away and will sell this to their superior)
* seed fear in your colleagues’ minds by using AI stuff where possible
Think of AI as a new trend we must adapt to, just like FE moved from jQuery to React. Life and work go on; this wave just nudges the complacent bunch to finally get out of their comfort zone and learn something new or get left behind.
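For what it's worth, the "train smaller models on your own internal documents, build a chat interface on top" bullet is usually implemented as retrieval over the documents rather than actual fine-tuning. A minimal Python sketch of the retrieval half, using only the standard library (the document names and contents are invented for illustration):

```python
import math
from collections import Counter

def tokenize(text: str) -> list[str]:
    # Lowercase whitespace split; a real system would use a proper tokenizer.
    return text.lower().split()

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two bag-of-words vectors.
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def top_doc(question: str, docs: dict[str, str]) -> str:
    # Return the name of the internal doc most similar to the question;
    # that doc would then be pasted into the model's context as grounding.
    q = Counter(tokenize(question))
    return max(docs, key=lambda name: cosine(q, Counter(tokenize(docs[name]))))

# Hypothetical internal documents.
docs = {
    "onboarding.md": "how to set up your laptop and request VPN access",
    "deploy.md": "steps to deploy the backend service to production",
}
print(top_doc("how do I deploy to production?", docs))  # → deploy.md
```

The chat interface is then just a loop that retrieves the best-matching doc and sends it, with the question, to whatever model you have access to. (Heed the sibling comment's warning: get permission before feeding internal documents to anything.)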
tonyedgecombe
>seed fear in your colleagues’ minds by using AI stuff where possible
You mean make them terrified of debugging your code after you have moved on?
stogot
Watch your boss file you for training a model on internal documents without permission
Havoc
> seed fear in your colleagues’ minds
Yikes
thor_molecules
> Many people who first entered senior roles in 2010-2020 are finding current roles a lot less fun.
This resonates with me.
I find that the current crop of new tech (AI) produces a lot of cognitive dissonance for me in my day-to-day work.
Most initiatives/projects/whatever around AI seems to be of the "digging your own grave" variety - making tools to replace software engineers.
Definitely not fun.
SwtCyber
The shift in what's valued (moving from team-building and hiring prowess to sheer execution speed and adaptability) has been stark. It's also unsettling to see so many senior folks feeling "left behind" not because they lack skills, but because the rules of the game have changed so quickly...
diordiderot
> Steer away from skills like web development that are clearly getting eaten by LLMs.
There's a slim chance that frontend work runs into Jevons paradox.
Rather than churning out simpler interfaces faster, the complexity will grow.
More 3D sites, MR, etc
flamboyant_ride
> decision-makers can remain irrational longer than you can remain solvent
Definitely feels true. I wish ICs had much more agency, or that there were something actionable, even as a small step (which I don't think is in the article).
Having been laid off for unclear reasons, I can't help but think the same could happen in my next prospective job (if I get one in this market, that is) and there's nothing I could do about it.
jll29
> managers were generally evaluated in that period based on their ability to hire, retain and motivate teams. The current market doesn’t value those skills particularly highly, but instead prioritizes a different set of skills: working in the details, pushing pace, and navigating the technology transition to foundational models / LLMs. This means many members of the current crop of senior leaders are either worse at the skills they currently need to succeed, or are less motivated by those activities. Either way, they’re having less fun.
Don't geeks enjoy playing with LLMs much more than hiring or any other admin/people/meeting type interaction? They should find dealing with the new wave of AI success a lot of fun, and a lot MORE fun than governance stuff, IMHO.
The statement
>The current market doesn’t value those skills particularly highly, but instead prioritizes a different set of skills: working in the details, pushing pace, and navigating the technology transition to foundational models / LLMs.
depends on the assumption that technology must "transition" to "foundational models / LLMs". The author doesn't seem to interrogate this assumption. In fact, most of the career malaise I've seen in my work is based on the assumption that, for one reason or another, technologists "must transition" to this new world of LLMs. I wish people would start by interrogating this bizarre backwards assumption (i.e., damn the end product! Damn the users! It must contain AI!) before framing career discussions around it.
However,
>decision-makers can remain irrational longer than you can remain solvent
is unfortunately painfully true.