
Generative AI Hype Peaking

111 comments

March 10, 2025

o_nate

There's an old game in the investing world of trying to time the top of a stock bubble by picking out the most breathless headlines and magazine covers, looking for statements such as the famous 1929 quote from two weeks before the market crash: "Stock prices have reached what looks like a permanently high plateau." By that metric, we may be getting close to the top of the AI hype bubble, with headlines such as the one I saw recently in the NY Times for an Ezra Klein column: "The Government Knows A.G.I. Is Coming".

gh0stcat

The AGI piece from Ezra was frustrating, to the extent that after listening to him talk about technology in this podcast, it made me question the quality of his knowledge in domains I know far less about.

cyberlurker

Listening to his podcast on the topic was so disappointing. I think Ezra is a smart guy, but he doesn’t understand the field and the entire premise of the long discussion was that LLMs are going to get us to AGI.

mzronek

He also casually dropped that he talked to people at firms with "high amount of coding", who told him that by the end of this or next year "most code will not be written by humans".

Yeah, okay. I work each day with Copilot etc and the practical value is there, but there are so many steps missing for this statement to be true, that I highly doubt it.

My point is, wouldn't we already see tools that are at least getting close to this goal? I can't believe that (or AGI, for that matter) will arrive as a big-bang release. It looks more like baby steps for now.

thfuran

I mostly write Java, and I rarely even look at the bytecode, let alone the output of C2. I guess AGI took my job.

DebtDeflation

I stop reading/listening as soon as AGI or Superintelligence is mentioned.

kurthr

[flagged]

layer8

Artificial Glitchy Intelligence

falcor84

For what it's worth, intelligent sexual robots will likely be massive in the next decade

breckenedge

This article is way too light on details. Does it conflate Nvidia's stock price with interest in generative AI? New use cases arrive every month. Nine months ago I was amazed by Cursor and was leading the push to get my team to switch to it. Three months ago it was Cursor adding agents, and I was again demonstrating the benefits to my colleagues. Now I'm using Cline + Claude 3.7 and I'm more productive than I've ever been, and I haven't even touched MCPs yet.

Definitely not peaked yet, IMO. That said, I don't see it fully replacing developers in the next 1-2 years; it still gets caught in loops way too often and makes silly mistakes.

Etheryte

I would say the hype has started to fall off, as it's becoming increasingly obvious that AGI is not around the corner, but meanwhile practical use cases keep getting better and better. The more we understand the strengths and weaknesses, the better we can exploit them, and even if the models themselves have hit the scaling wall, I think tooling around them is far from done.

bwestergard

Thanks for your comment.

I am arguing the hype has peaked, and that there will likely be a pull back in investment in the next year. This is not to say the technology has "peaked", which I'm not sure one could even define precisely.

Important technologies emerged during each past "AI summer", and did not disappear during "AI winter". LISP is more popular than ever, despite the collapse of hype in symbolic reasoning AI decades ago.

As I mention in the OP, I think productivity enhancing tools for developers are one of the LLM applications that is here to stay. If I didn't think so, I wouldn't be concerned about the impact on skill development among developers.

https://en.wikipedia.org/wiki/AI_winter

breckenedge

Thanks for the clarification. IMO, the recent drawdown in investment is political, as people are seeing the US tech giants facing a more uncertain future under the current administration.

https://stocks.apple.com/AaYuqQyN_QVOYHhUY2G0IHg

Yoric

If my memory serves, both SQL and HTML are indirect fallouts from an AI summer.

alexpotato

Counter point:

The decline in NVDA stock price may also be due to newer models that require fewer GPUs, specifically from NVDA.

In other words, the demand may stay the same but if fewer GPUs in general or non-NVDA GPUs specifically get you to the same point performance-wise then the supply just went up.

dowager_dan99

this seems realistic with consideration of a lot of previous tech advancements. We focus on the disruption, but meanwhile the efficiency improvements follow closely (and often more easily) on the coat-tails. We should definitely be looking for more efficient production of what's already been proven, vs. the next big step happening immediately.

maxglute

Peaking hype = investors think generative AI may replace billions of dollars of economic activity instead of trillions, within the return window they're looking at.

breckenedge

I believe it continues to get faster and better from here. We haven’t even scratched the surface of deployed capabilities yet. Sure it might not be a path to AGI, but it could still replace many people in many roles. It may not be a specific company’s silicon that wins, but generative AI is just getting started. Yes, Claude 3.7 and ChatGPT 4.5 are not as groundbreaking as previous iterations, but there are so many untouched areas ripe for investment.

MangoCoffee

AI is just a tool. It's not going to replace human coders anytime soon.

giantrobot

That's not what the C-suite is telling their boards/investors as they conduct layoffs to goose margins at the end of the quarter. So a lot of people are having their lives and livelihoods upended because of unrestrained hype.

yubblegum

> Nvidia’s stock price

Market may be pricing in possible takeover of Taiwan by China.

daedrdev

Stocks are down because the president of the US has entered a costly trade war, actually.

tenpies

What do you make of something like Reddit (RDDT, down 15% at this moment)?

It's unaffected by tariffs, but its insane valuation is driven by the narrative that Reddit posts can be used to train AI. Without that narrative, you have a semi-toxic collection of forums, and the valuation would probably be somewhere in the millions at best, not the current ~$20B.

greener_grass

The companies that might acquire Reddit are affected by tariffs.

aetherson

I mean, not to say that you might not have some explanatory power here, but the market is complex and difficult to untangle, and at least some analysts are predicting recession which will certainly have effects on Reddit even if it's not directly affected by tariffs. We can all cherry-pick individual stocks.

loandbehold

Reddit is an ad-driven business. Ad revenues decline when economy shrinks.

mixmastamyk

When a correction happens, everyone with short-term funds pulls them out. Doesn't matter if the issue has a direct connection to the stock or "makes sense" at all.

bwestergard

No disagreement from me there. But for the year to date the Nasdaq composite is down less than 4%, whereas NVIDIA is down 20%.

YetAnotherNick

Nvidia is up 24% in the last year, compared to <10% for the Nasdaq or S&P. Cherry-picking the point to compare from is bad.

enragedcacti

It's not cherry-picking to use recent data to dispute a claim about recent events. YTD might not be the best choice, but it's better than 1Y. 1M, or Feb 20th to now, gives similar though not quite as extreme differences (-7-8% SPY vs. -20-23% NVDA).
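The drawdowns being compared are simple percent changes over a chosen window. A minimal sketch, using hypothetical placeholder prices (not actual quotes) to show how the window's start point drives the figure:

```python
def pct_change(start: float, end: float) -> float:
    """Percent change from a start price to an end price."""
    return (end - start) / start * 100

# Hypothetical closing prices, for illustration only.
nvda_start, nvda_now = 140.0, 110.0
spy_start, spy_now = 610.0, 565.0

print(f"NVDA: {pct_change(nvda_start, nvda_now):+.1f}%")  # -21.4%
print(f"SPY:  {pct_change(spy_start, spy_now):+.1f}%")    # -7.4%
```

Shifting `start` to a different date (YTD vs. 1Y vs. 1M) changes both numbers, which is why the choice of comparison window matters so much in these arguments.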

mlinhares

I'm definitely betting the AI bubble is going to burst but NVIDIA isn't the company that will go down with it, they actually have a real business that is not just AI hype behind it. The insane valuation they have now might not hold but I doubt they are at any risk of disappearing.

danielcampos93

They also 10xed their revenue; 24% seems low for a company that pulled that off.

dwedge

Because of deepseek right? Saying it's the end of AI because of another AI is difficult to swallow

null

[deleted]

SubiculumCode

A highly unpredictable trade war, as well as rattling every international ally, convincing nations across the world to choose other military platforms than ours because Trump could just turn it off at a whim, increasing risk of political instability leading nations to turn to other nations for investment. Our economy is a ticking time bomb under Trump.

deadbabe

The stock market is not the economy. International allies' purchasing decisions are not the economy.

cyberlurker

Which part of the economy are you taking inspiration from?

hnthrow90348765

>We may look back in a decade and lament how self-serving and short-sighted employers stopped hiring less experienced workers, denied them the opportunity to learn by doing, and thereby limited the future supply of experience developers.

I think bootcamps will bloom again and companies will hire people from there. The bootcamp pipeline is way faster than 4 year degrees and easy to spin up if the industry decides the dev pipeline needs more juniors. Most businesses don't need CompSci degrees for the implementation work because it's mostly CRUD apps, so the degree is often a signal of intellect.

This model has a few advantages to employers (provided the bootcamps aren't being predatory) like ISAs and referrals. Bootcamp reputations probably need some work though.

What I think will go away is the bootstraps idea that you can self-teach and do projects by yourself and cold-apply to junior positions and expect an interview on merit alone. You'll need to network to get an 'in' at a company, but that can be slow. Or do visible open source work which is also slow.

ike2792

The problem with bootcamps right now is that they provide no predictive value. If I hire someone with a CS degree from, say, Stanford that has 2-3 internships and a few semester-long projects under their belt, that gives me reasonable confidence as a manager that the person has what it takes to solve problems with software and communicate well. Bootcamp candidate resumes are all basically identical and the projects are so heavily coached and curated that it is difficult to figure out how much the candidate actually knows.

hnthrow90348765

In this hypothetical scenario from the article, it's been years since employers stopped hiring juniors, so depending on when they graduate, there's probably employment gaps or unrelated work to factor into your decision as well.

And after this period, when companies start hiring juniors again, the amount of Stanford-like graduates may still be small because few wanted to go into CS. You have like a 2-4 year wait for people deciding to go into CS again.

If you are FAANG, you can throw money at the problem to get the best, but ordinary businesses probably won't be able to get Stanford grads during a junior-hiring boom.

aggie

Most people do not have elite resumes and most people are not hiring people with elite resumes. There's plenty of uncertainty in hiring in general, and that being the case with bootcamps isn't much different than a typical resume with a 4-year degree.

shortstuffsushi

I agree with the position that most people are not coming from "elite" schools, as someone who hires in the Midwest. I still much prefer someone from a four-year school (and, as the other poster mentioned, with internships) to a bootcamp grad. I've had one bootcamp graduate out of five total who was at a useful starting skill level, compared to probably 90% (don't have a count for this one) at "base useful" skill out of college.

ike2792

I used Stanford as an example, but plenty of companies focus on CS grads from big state schools like Purdue, Michigan, Ohio State, etc that have similar resumes. In my experience, graduates from 4-year CS programs with some internship experience vastly outperform bootcamp grads as a group. I have hired and worked with some outstanding bootcamp grads, but you would never know that they stood out before actually interviewing them since most bootcamps have standard resume templates they tell their grads to use. In an era of 200+ applicants/day for every junior engineering role, you need to be able to tell that someone probably has what it takes to succeed after a 30 second resume scan.

jsight

If the average person has still not ridden in a self-driving car, assembled by Figure 02-style robots, through a drive-thru with AI ordering, then we aren't even close to seeing the real peak here.

>100x growth ahead for sure.

hylaride

Most people (in the world) hadn't been on the internet in 2000 when the dot-com crash happened. Barely half the US population was even online at that point. We're probably nowhere near the peak of AI ability or usage, but that doesn't mean there hasn't been a lot of mal-investment or that things can't commodify.

Huge amounts of internet growth still happened after the 2000 crash, but networking gear and fiber-optic networking became a commodity play, meaning the ROI shifted. The companies that survived ended up piggybacking on the over-investment on the cheap, including Amazon and Google.

Even going way back, the real productive growth of the American railroads didn't happen until after the panic of 1873 after overbuilding was rationalized.

jsight

Agreed, and good reminder that a lot of people here have probably only learned about the dot-com crash from comments and history. I remember when Cisco had a P/E in the hundreds during that era. People have forgotten just how stratospheric some of those valuations were back then.

I hate to say "this time is different", but it really doesn't feel the same, at least in public equities. Nvidia has a high stock price, but it also has a P/E of ~36. Meanwhile, modern Cisco's is ~27.
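The multiples above are just price divided by earnings per share. A minimal sketch with hypothetical numbers (not actual quotes or filings) to make the dot-com contrast concrete:

```python
def pe_ratio(price: float, eps: float) -> float:
    """Trailing P/E: share price divided by annual earnings per share."""
    return price / eps

# Illustrative figures only. A P/E of 36 means the market pays
# $36 per $1 of annual earnings.
print(pe_ratio(108.0, 3.0))   # 36.0

# Dot-com-era valuations like Cisco's implied P/Es in the hundreds,
# i.e. hundreds of dollars paid per dollar of earnings:
print(pe_ratio(100.0, 0.5))   # 200.0
```

The point of the comparison: today's headline prices are high, but the earnings backing them are far larger relative to price than in 1999-2000.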

There might be some parallels though. OpenAI as modern Netscape might not be that far off.

rco8786

The author isn't claiming AI has peaked, only that the hype has peaked.

jsight

I came close to adding a paragraph about that. Inevitably someone would argue that there's a difference between hype peaking and 100x future growth.

Regardless of whether that distinction is useful, the author makes some fairly specific claims about inelasticity of demand, and seems only to lack confidence regarding the timing of Nvidia's fall, not its inevitability.

I disagree with all of that.

vessenes

What's crazy is we are very close to this right now, especially if you count industrial robots: BYD's production is almost totally autonomous, and I believe Tesla is close as well.

khrbrt

None of those cases are "generative" AI.

jsight

We can debate the definition of "generative", but it doesn't seem important. A key claim was that Nvidia's stock price decline is inevitable, with the only question being timing. Meanwhile, all of these other use cases will drive demand anyway.

But honestly, even chat apps are nowhere near their peak. Hallucinations and fine tuning issues are holding that segment back. There's a lot of growth potential there too as confidence and training help to increase adoption.

siliconc0w

IMO Grok and 4.5 show that we've reached the end of reasonable pre-training scaling. We'll see how far we can get with RL in post-training, but I suspect we're pretty close to maxed out there and will start seeing diminishing returns. The rest is just inference efficiency, porting the gains to smaller models, and building the right app-layer infrastructure to take advantage of the technology.

I do think we're overbuilding on Nvidia and the CUDA moat isn't as big as people think, inference workloads will dominate, and purpose-built inference accelerators will be preferred in the next hardware-cycle.

cenobyte

Anyone who thinks the hype has peaked is obviously too young to remember the dotcom bubble.

It will get so much worse before it starts to fade.

Infecting every commercial, movie plot, and article that you read.

I can still hear the Yahoo yodel in my head from radio and TV commercials.

JTon

> Yahoo yodel

I wanted to hear this again. Leaving it here for the next person: https://www.youtube.com/watch?v=Fm5FE0x9eY0

zekenie

Idk I used Claude Code recently and revised all my estimates. Even if the models stop getting better today I think every product has years of runway before they incorporate these things effectively.

qoez

One thing I'd love to short is the idea that we're going to have a second AI winter. Lots of people predict it, but I believe this time is actually a real step-function innovation (last time it was a very distant research project, and the money dried up because of competition with the much more lucrative internet, which was growing at the same time).

secretmark

Why would you need to short it? If you believe that's true, just go long on AI stocks or buy calls on those companies.

somewhereoutth

It will be the third major winter - there was one around 1974-1980 as well as 1987-2000.

I don't believe the previous 'summers' entailed quite the scale of [mal]investment that has occurred this time, so the impending winter will be correspondingly savage.

mordae

> has slackened modestly compared to late-2019 due to higher interest rates, the job market for less experienced developers seems positively dire.

Maybe in the US.

ypeterholmes

So Deep Research and the latest reasoning models don't deserve mention here? I wish there was accountability on the internet, so that people posting stuff like this can be held accountable a year from now.

_cs2017_

Skeptical as I am about generative AI, the quality of this particular article (in terms of evidence provided, logic, insights, etc.) is substantially lower than what ChatGPT / Gemini DeepResearch can generate. If I were grading, I'd rate an average (unedited) AI DeepResearch report 3/10, and the headline article 1/10.