I am just sooo sick of AI prediction content, let's kill it already
64 comments
November 19, 2025
agentifysh
The only prediction I think is robust is: those who use AI as a tool today will replace those who aren't tomorrow.
Same situation as the internet: we saw a bubble, but ultimately the companies that reorganized their business around it came to dominate industries where incumbents were slow to react.
Some jobs will be replaced outright, but most workers will use AI tools, and we might see reduced wages and fewer open positions for a very long time, coupled with an economic downturn.
bigstrat2003
> The only prediction I think is robust is: those who use AI as a tool today will replace those who aren't tomorrow.
That's not a robust prediction. Many people who don't use AI today don't do so because they've tried it and found it subtracts value. Those people will not be replaced tomorrow; they will simply reevaluate the tool and start using it once it adds value.
criley2
If jobs were based on self-perceived value addition, there would never be a layoff.
Your executive team is going to "remove" non-AI folks regardless of their claims about efficiency.
Just like they forced you to return to the office while ignoring the exact same efficiency claims. They had real estate to protect. Now they have AI to protect.
xoac
I upvoted both posts
moron4hire
"Now they have AI to protect," you're ultimately talking about corporate leadership being susceptible to the sunk cost fallacy here. But AI investment is a particularly easy sunk cost to get out of compared to real estate. Real estate has lease obligations and moving costs; it will cost you a lot more in the short term to get rid of your real estate. AI you could just stop using and immediately see cost reduction.
IncandescentGas
Identifying the 1% of AI use cases that are useful, and refusing to have your attention stolen by the 99% that is mind-melting garbage, will be the key AI skill for the AI future.
Dilettante_
So same as with the internet
AuthAuth
AI tools will get easier and easier to use. There will be no skill level required to use them effectively.
frde_me
Has this ever been true for any technology, in the sense that the curve from skill to output quality is completely flat?
I would be suspicious of this claim.
AuthAuth
We can already see it. Take a look at AI image generation a few years ago: people were crafting complex prompts, tweaking values, adding overlay models, and stringing together several AI tools to get a decent output. Now you can get a better result by typing a simple phrase into any of the major AI web interfaces. Tools like Adobe's have simplified all these features to the point where they can be learned in under five minutes.
This is only the start. Once AI gets good, it will be so easy to use that I doubt any human will be unable to use it. That's the nature of natural-language queries and of companies working to build a model that can handle "anything" thrown at it.
yesfitz
Skill to effective output quality.
I'm sure there are people who are more skilled at using a cell phone than I am. It doesn't matter.
Similarly, we all have had co-workers or friends who aren't very good at using search engines. They still use them and largely still have jobs.
Now that I think of it, most regularly-used technology is like this. Cars, dishwashers, keyboards, electric shavers. There is a baseline level of skill required for operation, but the marginal benefits for being more skilled than the baseline drop off pretty quickly.
nosianu
> Has this been true for any technology, ever?
Yes?
Try the abacus, slide rules, or mechanical calculating machines vs. electronic calculators.
Or ancient vs. modern computers and software. They didn't even have "end-users" as we understand them now; every computer user was a specialist.
Programming.
Writing: quill vs. ballpoint pen, but also alphabets vs. the writing systems that came before.
Photography: more than one big jump in usability. Film cameras, projectors/screens.
Transportation: from navigation to piloting aircraft or cars, where originally you had to be a part-time mechanic.
Many advanced tools (i.e. more complex than, say, a hammer) in manufacturing or at home.
mgrandl
Does everything always need a precedent?
HDThoreaun
The internet was the same. It didn't stop legacy businesses from getting their lunch eaten by internet-native companies.
zeroonetwothree
Some businesses survive without using the internet, so this isn't the strongest argument. And even more use it minimally, e.g. they just have a Yelp page or something.
dmitrygr
Just like the internet... completely safe to use, no malicious downloads, no scams to spot, totally safe and easy to use with no skill... oh wait...
sodapopcan
That's not what they're saying.
The easier "AI" gets to use (as it is being "promised" it will), the quicker a skilled engineered is going to be able to adapt to it whenever they give up and start using it. They'll likely be ahead of any of those previous adopters who just couldn't resist the allure of simply accepting whatever is spit out without thoroughly reviewing it first.
righthand
Have you considered the inverse?
Those who use AI as a tool today will be replaced by those who aren't tomorrow.
angst_ridden
Those who know how to fix the messes made by AI today will replace those who don't tomorrow.
darkmarmot
This is what I'm seeing currently at work. YMMV.
risyachka
Everyone uses or will use AI; there is no learning curve, so this is not an advantage.
eloisant
Yes there is. For coding, for example, you need to learn how to use the tools efficiently; otherwise you'll get garbage and end up either discarding everything and claiming AI is crap, or pushing it to prod and having to deal with the garbage code there.
shkkmo
> Those who use AI as a tool today will replace those who aren't tomorrow.
Unless they let their skills atrophy by offloading them to AI. The things they can do will be commodified and low value.
I suspect there will be demand for those who instead chose to hone their skills.
stego-tech
There always has been, thus far. When I was attending community college for an A+ class in high school, my lab partner was a woman in her early 40s who pulled down a staggering amount of money doing COBOL programming. I learned firsthand that for every advancement in technology, there will always be folks who (rightly or wrongly) find no value proposition in upgrading needlessly.
AI as it presently stands is very much one of those things where, in the immediate term, sure, there's money to be made jumping on the bandwagon. Even I keep tinkering with it in some capacity from an IT POV, and it has some legitimate use cases that still surprise me sometimes.
However, I aim to build a career like the COBOL programmer did: staying technically sharp as the world abstracts away, because someone, somewhere, will eventually need help digging out of a hole that upgrades or automation got them into.
And at that point, you can bill for the first class airfare, the five-star hotel, and four-figures a day to save their ass.
ronsor
I think if you have that problem, then you're not using AI as a tool; AI is using you.
Using AI as a tool doesn't mean having it do everything; it means you have the skill and knowledge to know where and how you can use it.
rileymichael
> it means you have the skill and knowledge to ...
Sure, but in the real world the overwhelming majority of people loudly proclaiming the benefits of AI don't actually have the skill, knowledge, or discipline to do so, or to judge its correctness. It's peak Dunning-Kruger.
wslh
Using AI as a tool is similar to using a search engine and specific sites in the past. People are using it naturally, for the things it works for.
iLoveOncall
> The only prediction I think is robust is: those who use AI as a tool today will replace those who aren't tomorrow.
And I make the inverse prediction.
I work for a FAANG and I see it: from juniors to senior engineers, the ones who use AI generate absolute slop that is unreadable, unmaintainable, and definitely going to break. They are making themselves not just redundant, but an actual danger to the company.
Avicebron
I pretty much agree, but using "as an AI engineer myself" or some variation of it in your blog post should get you ridiculed. Who exactly are you trying to impress or differentiate yourself from?
nathan_compton
> The worst thing about this parasitic trend is that most of the time it’s basically a dude who wants to appear visionary and so he makes a prediction of the future.
This is basically an entire genre of low-effort Hacker News posts.
saltcured
The ones making grandiose predictions or the ones making broad, mildly cynical dismissals?
:-)
mwhitfield
Or the twitter account of any VC
Seattle3503
> Now, I should clarify: I am not against talking about the impact of AI. It is a truly transformative technology after all.
This is how I feel. You see so many articles prognosticating and living in the world of hypotheticals, while AI is transforming industries today, and articles tracking those changes feel rare. Are they on some obscure blog?
sodapopcan
So long as you are enabling folks to pump out AI garbage, I'll be pumping out my garbage predictions, thank you much.
hyperhello
The AI writes the AI prediction content. It can’t give you any new information.
jeswin
This has already been discussed so many times. No good discussion will come out of this; it'll just be people moping.
There's much better content on Show HN, some of which won't hit the homepage because this has more votes. It's a problem that HN has to fix: people upvote because they agree, and that vote carries the same weight as another that required far more effort (trying a product, looking at the code, etc.).
hollasch
The best action in a reply-all storm is to send a response to everyone pleading for them to stop replying all.
raincole
> It appears we have become the LLMs
Always have been.
Anyway, complaining about them doesn't add any value either. And complaining about complaining... well you get the idea.
paulpauper
The biggest growth industry in AI is people doing podcasts and writing blog posts about the implications of AI or predictions of AI. It seems like >90% of articles from major media sources mention AI somewhere.
dinobones
I’ve felt the same. Also, the AGI outcome for software engineers is either:
A) In 5 years, no real improvement; the AI bubble pops; most of us are laid off.
B) In 5 years, near-AGI replaces most software engineers; most of us are laid off.
Woohoo, a lose-lose scenario! No matter how you imagine this AI bubble playing out, the music's going to stop eventually.
aj_hackman
The glee I see in many people over this possibility is quite chilling.
zeroonetwothree
All bubbles eventually pop. But it doesn’t mean we end up worse off than before.
This stuff is like the monster in The Blob: the more energy you direct at it, the bigger it gets. So your post and my comment are just feeding it.