Are we repeating the telecoms crash with AI datacenters?
75 comments
·December 3, 2025
nuc1e0n
Anon1096
Besides the fact that this article is obviously AI generated (and not even well; why are there mismatches in British/American English? I can only assume the few parts in British English are the human author's writing or edits), yes, "overutilization" is not a real thing. There is a level of utilization at every price point. If something is "overutilized" that actually means it's just being offered at a low price, which is good for consumers. It's a nice scare word though, and there's endless appetite at the moment for AI-doomer articles.
bobmcnamara
> why are there mismatches in British/American English
You sometimes see this with real live humans who have lived in multiple countries.
heliumtera
To be honest it doesn't feel manually edited.
Bullet-point hell, and a table that feels like it came straight out of Grok.
knollimar
By this logic loss leaders to drive out competition are good for the consumer, no?
martinald
Author here, I mix up American and British English all the time. It's pretty common for us Brits to do that imo.
See also how all (?) Brits pronounce Gen Z in the American way (ie zee, not zed).
an0malous
> The article claims that AI services are currently over-utilised. Well isn't that because customers are being undercharged for services?
Absolutely, not only are most AI services free but even the paid portion is coming from executives mandating that their employees use AI services. It's a heavily distorted market.
treis
We're talking a miraculous level of improvement for a SOTA LLM to run on a phone without crushing battery life this decade.
People are missing the forest for the trees here. Being the go-to consumer Gen AI provider is a trillion+ dollar business. How many tens of billions you waste on building unnecessary data centers is a rounding error. The important number is your odds of becoming that default provider in the minds of consumers.
bayarearefugee
> The important number is your odds of becoming that default provider in the minds of consumers.
I haven't seen any evidence that any Gen AI provider will be able to build a moat that allows for this.
Some are better than others at certain things over certain time periods, but they are all relatively interchangeable for most practical uses and the small differences are becoming less pronounced, not more.
I use LLMs fairly frequently now and I just bounce around between them to stay within their free tiers. Short of some actual large breakthrough I never need to commit to one, and I can take advantage of their own massive spends and wait it out a couple of years until I'm running a local model self-hosted with a cloudflare tunnel if I need to access it on my phone.
And yes, most people won't do that, but there will be a lot of opportunity for cheap providers to offer that as a service with some data center spend, but nowhere near the massive amounts OpenAI, Google, Meta, et al are burning now.
nuc1e0n
How big does an LLM need to be to support natural language queries with RAG?
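For what it's worth, in a RAG setup the retrieval step carries most of the knowledge, so the generator mostly has to paraphrase whatever context gets retrieved, which is part of why people expect fairly small models to suffice. A minimal sketch of the retrieval half is below; the toy bag-of-words "embeddings" stand in for a real embedding model, and the generation step is left to whichever local model you'd actually run, so treat all the names here as illustrative only:

    import math
    from collections import Counter

    # Tiny stand-in corpus; in practice this would be your own documents.
    DOCS = [
        "Dark fiber is unused optical fiber laid during the telecom boom.",
        "RAG retrieves relevant passages and feeds them to the model as context.",
        "A B200 is an NVIDIA data-center GPU used for AI training and inference.",
    ]

    def embed(text):
        # Toy bag-of-words vector; a real system would use an embedding model.
        return Counter(text.lower().split())

    def cosine(a, b):
        dot = sum(a[t] * b[t] for t in a)
        norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    def retrieve(query, k=1):
        # Rank documents by similarity to the query and keep the top k.
        q = embed(query)
        return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

    query = "what is dark fiber"
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    print(prompt)  # hand this prompt to whatever small local model you like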
mattnewton
My hot (maybe just warm these days) take is that the problem with voice assistants on phones is that they have to have reasonable responses to a long tail of queries or users will learn not to use them, since the use cases aren't discoverable and the primary value is talking to it like a person.
So voice assistants backed by very large LLMs over the network are going to win even if we solve the (substantial) battery usage issue.
mNovak
Of all the players, I'd argue Google certainly knows how to give away a product for free and still make money.
The local open source argument doesn't hold water for me -- why does anyone buy Windows, Dropbox, etc when there are free alternatives?
steveBK123
Yes, it seems to me their strategy is to watch the OpenAIs, Anthropics, etc. of the world bleed themselves to death.
btilly
Yes, over-utilization is a natural response to being undercharged. And being undercharged is a natural result when investors are throwing money at you. During bubbles, Silicon Valley often goes to "lose money, make it up with scale". With the vague idea that after you get to scale, THEN you can figure out how to make money. And fairly consistently, their idea for how to make money is "sell ads".
Past successes like Google encourage hope in this strategy. Sure, it mostly doesn't work. Most of everything that VCs do doesn't work. Returns follow a power law, and a handful of successes in the tail drive the whole portfolio.
The key problem here doesn't lie in the fact that this strategy is being pursued. The key problem is that it is rare for first-mover advantages to last with new technologies. That's why Netscape and Yahoo! aren't among the FAANGs today. The long-term wins go to whoever successfully creates a sufficient moat to protect lasting excess returns. And the capabilities of each generation of AI leapfrog the last so thoroughly that nobody has figured out how to create such a moat.
Today, 3 years after launching the first LLM chatbot, OpenAI is nowhere near as dominant as Netscape was in late 1997, 3 years after launching Netscape Navigator. I see no reason to expect that 30 years from now OpenAI will be any more dominant than Netscape is today.
Right now companies are pouring money into their candidates to win the AI race. But if the history of browsers repeats itself, the company that wins in the long term would launch about a year from now, focused on applications on top of AI. And its entrant into the AI wars wouldn't get launched until a decade after that! (Yes, that is the right timeline for the launch of Google, and Google's launch of Chrome.)
Investing in Silicon Valley is like buying a positive-EV lottery ticket. An awful lot of people are going to be reminded the hard way that it is wiser to buy a lot of lottery tickets than to sink a fortune into a single big one.
rhetocj23
[dead]
aitchnyu
Will the OpenRouter marketplace of M clouds × N models die if the investor money stops? I believe it's a free and profitable service, offered completely pay-as-you-go.
iambateman
The thing that makes AI investment hard to reason about for individuals is that our expectations are mostly driven by a single person’s usage, just like many of the numbers reported in the article.
But the AI providers are betting, correctly in my opinion, that many companies will find uses for LLMs that run into the trillions of tokens per day.
Think less of “a bunch of people want to get recipe ideas.”
Think more of “a pharma lab wants to explore all possible interactions for a particular drug” or “an airline wants its front-line customer service fully managed by LLM.”
It’s unusual that individuals and industry get access to basically similar tools at the same time, but we should think of tools like ChatGPT and similar as “foot in the door” products which create appetite and room to explore exponentially larger token use in industry.
malfist
But if I'm a pharma lab, I don't want to rely on a statistical engine that makes mistakes to answer those questions; I want to query a database that is deterministic.
gradus_ad
When I'm building out a new feature, I can churn through millions of tokens in Claude Code. And that's just me... Now think about Claude Code but integrated with Excel or Datadog, or whatever app could be improved through LLM integration. Think about the millions of office workers, beyond just software engineers, who will be running hundreds of thousands or millions of tokens per day through these tools.
Let's estimate 200 million office workers globally as the TAM, each running an average of 250k tokens per day. That's 50 trillion tokens DAILY. Not sure what model provider profit per token is, but let's say it's .001 cents.
That's $500M per day in profit.
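Spelling that arithmetic out (all three inputs are the guesses above, not measured figures):

    # Back-of-envelope check of the numbers above; every input is an assumption.
    workers = 200_000_000            # assumed global TAM of office workers
    tokens_per_worker = 250_000      # assumed average daily tokens each
    profit_per_token = 0.001 / 100   # ".001 cents" = $0.00001 per token (assumed)

    daily_tokens = workers * tokens_per_worker       # 5e13, i.e. 50 trillion
    daily_profit = daily_tokens * profit_per_token   # ~$500M
    print(f"{daily_tokens:.1e} tokens/day -> ${daily_profit:,.0f}/day")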
not_the_fda
Currently there is no profit per token, quite a bit of loss per token; that's the problem. You're not going to make it up in volume.
epistasis
> Think more of “a pharma lab wants to explore all possible interactions for a particular drug”
Pharma does not trust OpenAI with their data, and they don't work on tokens for any of the protein or chemical modeling.
There will undoubtedly be tons of deep nets used by pharma, with many $1-10k buys replacing more expensive physical assays, but it won't be through OpenAI, and it won't be as big as a consumer business.
Of course there may be other new markets opened up but current pharma is not big enough to move the needle in a major way for a company with an OpenAI valuation.
s_ting765
> “an airline wants its front-line customer service fully managed by LLM.”
This has been experimented with before by many companies in recent years, most notably Klarna, which was among the earliest guinea pigs and later had to backtrack on this "novel" idea when the results came in.
dust42
OpenAI has 800,000,000 weekly users but only 20,000,000 are paying while 780,000,000 are free riding. Should they by accident under-provision, they could simply remove the freebie and raise prices for the paying clients. But that is not what they want.
IMHO the investors are betting on a winner-takes-all market and that some magic AGI will come out of OpenAI or Anthropic.
The questions are:
How much money can they make by integrating advertising and/or selling user profiles?
What is the model competition going to be?
What is the future AI hardware going to be - TPUs, ASICs?
Will more people have powerful laptops/desktops to run a mid-sized model locally and be happy with it?
The internet didn't stop after the dotcom crash and AI won't stop either should there be a market correction.
alex_c
>OpenAI has 800,000,000 weekly users but only 20,000,000 are paying while 780,000,000 are free riding.
By itself, this doesn't tell us much.
The more interesting metric would be token use comparison across free users, paid users, API use, and Azure/Bedrock.
I'm not sure if these numbers are available anywhere. It's very possible B2B use could be a much bigger market than direct B2C (and the free users are currently providing value in terms of training data).
mwkaufma
Simultaneous claims that 'agentic' models are dramatically less efficient, but also forecasts of efficiency improvements? We're in full-on tea-leaf-reading mode.
paulorlando
The 2001 telecoms crash benefited companies that came later, in the form of inexpensive dark fiber available after the bubble popped. WorldCom, ICG, and Williams sold off to Verizon, Level 3, Teleglobe, and others. That in turn helped future Internet companies gain access to plentiful and inexpensive bandwidth. Cable telephony companies such as Cablevision Systems, Comcast, Cox Communications, and Time Warner used the existing coaxial connections into the home to launch voice services.
epistasis
This is indeed true, but doesn't fiber have a far longer lifetime than GPU heavy data centers? The major cost center is the hardware, which has a fairly short shelf life.
gretch
Well you still get the establishment of 1) large industrial buildings, 2) water/electricity distribution, and 3) trained employees who know how to manage a data center.
Even if all of the GPUs inside burn out and you want to put something else entirely inside of the building, that's all still ready to go.
Although there is the possibility they all become dilapidated buildings, like abandoned factories
epistasis
The building and electrical infrastructure are far cheaper than the hardware. So much so that the electricity is a small cost of the data center build out, but a major cost for the grid.
If the most valuable part is quickly depreciating and goes unused within the first few years, it won't have a chance for long-term value like fiber did. If data centers become, I don't know, battery grid storage, it will be very, very expensive grid storage.
Which is to say that while there was an early salivation for fiber that was eventually useful, overallocation of capital to GPUs goes to pure waste.
gmm1990
Some of the utilization comparisons are interesting, but the article's claim that $2 trillion was spent on laying fiber seems suspicious.
observationist
There's an enormous amount of unused, abandoned fiber. All sorts of fiber was run to last-mile locations across most cities in the US, and a shocking amount effectively got abandoned in the frenzy of mergers and acquisitions. $2 trillion seems like a reasonable estimate.
Giant telecoms bought big regional telecoms which came about from local telecoms merging and acquiring other local telecoms. A whole bunch of them were construction companies that rode the wave, put in resources to run dark fiber all over the place. Local energy companies and the like sometimes participated.
There were no standard ways of documenting runs, and it was beneficial to keep things relatively secret: if you could provide fiber capabilities in a key region while your competition was rolling out DSL and investing lots of money, you could pounce and make them waste resources, and so on. This led to enormous waste and fraud, and we're now at the outer edge of usability for most of the fiber that was laid. 29-30 years after it was run, most of it has never been used and never will be.
The 90s and early 2000's were nuts.
sosodev
I so desperately wish it weren't abandoned. I hate that it's almost 2026 and I still can't get a fiber connection to my apartment in a dense part of San Diego. I've moved several times throughout the years and it has never been an option despite the fact that it always seems to be "in the neighborhood".
Spooky23
That has nothing to do with fiber; it's all about politics and a regulatory environment where nobody is incented to act. Basically, the states can't fully regulate internet service and the Federal government only wants to fund buildouts on a pork-barrel basis, most recently rural ones.
At the local level, there is generally a cable provider with existing rights of way. To get a fiber provider, there are four possible outcomes: universal service funded by direct subsidy, cherry-picked service (they install where convenient), universal service capitalized by the telco, and "fuck you", where they refuse to operate (i.e. Verizon in urban areas).
The privately-capitalized card was played by cable operators in the 80s (they were innovators then, and AT&T had just been broken up and was in chaos). They have franchise agreements whose exclusivity was used as loan collateral.
Forget about San Diego, there are neighborhoods in Manhattan with the highest population density in the country where Verizon claims it’s unprofitable to operate.
I served on a city commission where the mayor and county were very interested in getting our city wired, especially as legacy telco services are on the way out and cable costs are escalating and will accelerate as the merger agreement that formed Spectrum expires. The idea was to capitalize last mile with public funds and create an authority that operated both the urban network and the rural broadband in the county funded by the Federal legislation. With the capital raised with grants and low cost bonding (public authority bonds are cheap and backed by revenue and other assets), it would raise a moderate amount of income in <10 years.
We had the ability to get the financing in place, but we would have needed legislation passed to get access to rights of way. Utilities have lots of ancient rights and laws that make disruption difficult. The politicians behind it turned over before that could be changed.
photochemsyn
For infrastructure, central planning and state-run systems make a lot of sense - this after all is how the USA's interstate highway system was built. The important caveat is that system components and necessary tools should be provided by the competitive private sector through transparent bidding processes - eg, you don't have state-run factories for making switches, fiber cable, road graders, steel rebar, etc. There are all kinds of debatable issues, eg should system maintenance be contracted out to specialized providers, or kept in-house, etc.
advisedwang
US GDP from 1995 to 2000 (inclusive) totaled about $52T. So that assertion would mean that about 3.8% of the US's economic activity was laying fiber. That seems like a lot, but in my ignorance doesn't sound totally impossible.
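As a quick check (both figures are the rough ones from the thread, not audited numbers):

    # ~$2T claimed fiber spend vs ~$52T cumulative US GDP, 1995-2000
    fiber_spend = 2e12
    gdp_1995_2000 = 52e12
    print(f"{fiber_spend / gdp_1995_2000:.1%}")  # ~3.8%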
recursive4
Stylistically, this smells like it was copied and pasted straight out of Deep Research. Substantively, I could use additional emphasis on the mismatch between expectations and reality with regard to the telco debt-repayment schedule.
asplake
Yes-or-no conclusions aside (and despite its title, the article deserves better than that), the key point, I think, is this one: “But unlike telecoms, that overcapacity would likely get absorbed.”
lazide
Telecom (dark fiber) capacity got absorbed too. Eventually. After a ton of bankruptcies.
Havoc
Don't think looking at the power consumption of B200s is a good measure of anything. It could well be an indication of higher density rather than of hitting limits and cranking voltage to compensate.
jsight
Yes, one of Nvidia's selling points for the B200 is that performance per watt is better than before. High power consumption without controlling for performance means nothing.
kqr
Is there a way in which this is good for a segment of consumers? When the current gen of GPUs is too old, will the market be flooded with cheap GPUs that benefit researchers and hobbyists who otherwise could not afford them?
LogicFailsMe
GPUs age surprisingly gracefully. If a GPU isn't cutting edge, you just tie two or more of them together for a bit more power consumption to get more or less the same result as the next generation GPU.
If there's ever a glut of GPUs that formula might change, but it sure hasn't happened yet. Also, people deeply underestimate how long it would take a competing technology to displace them. It took GPUs nearly a decade, plus the fortunate occurrence of the AI boom, to displace CPUs in the first place, despite bountiful evidence in HPC that they were already a big deal.
stego-tech
Unlikely, for a few reasons:
* The GPUs in use in data centers typically aren’t built for consumer workloads, power systems, or enclosures.
* Data Centers often shred their hardware for security purposes, to ensure any residual data is definitively destroyed
* Tax incentives and corporate structures make it cheaper/more profitable to write-off the kit entirely via disposal than attempt to sell it after the fact or run it at a discount to recoup some costs
* The Hyperscalers will have use for the kit inside even if AI goes bust, especially the CPUs, memory, and storage for added capacity
That’s my read, anyway. They learned a lot from the telecoms crash and adjusted business models accordingly to protect themselves in the event of a bubble crash.
We will not benefit from this failure, but they will benefit regardless of its success.
ares623
Some of them will probably be starving, homeless, or bedridden by the time that happens but yes they can get cheap GPUs
wmf
Many researchers and hobbyists cannot even plug in a 10 kW 8-GPU DGX server.
quickthrowman
Why not? It’s 40A at 240V, or 25% of the continuous load rating of a 200A 240V single-phase service.
If someone can afford an 8 GPU server, they should be able to afford some #6 wire, a 50A 2P breaker, and a 50A receptacle. It has the same exact power requirements as an L2 EV charger.
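A quick sketch of that panel math (assuming a 10 kW server, 240 V split-phase service, and the usual 80% continuous-load derating):

    # Rough check: does a 10 kW server fit on a 200 A residential service?
    server_w = 10_000
    volts = 240
    service_a = 200

    amps = server_w / volts               # ~41.7 A
    continuous_a = service_a * 0.8        # 160 A usable for continuous loads
    print(f"{amps:.1f} A, {amps / continuous_a:.0%} of continuous capacity")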
knollimar
You probably run a bigger (perhaps double) neutral and care about a stronger ground. But yeah, the $12 is rounding error at this scale
CamperBob2
That doesn't exactly bode well for the EV revolution, then, does it?
LogicFailsMe
The average commute in the United States is about 24 miles a day round trip. That's about 10 kWh. That's enough to charge overnight on a 15A circuit.
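Rough check of that claim (assuming a 120 V, 15 A circuit drawing at 80% continuously):

    # How long does a 15 A / 120 V circuit take to replace ~10 kWh?
    needed_kwh = 10
    charge_kw = 120 * 15 * 0.8 / 1000     # ~1.44 kW continuous
    print(f"{needed_kwh / charge_kw:.1f} hours")  # ~6.9 hours, easily overnight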
BuffaloEric33
Almost all home EV charging is <=10kW.
tekno45
wut?
turtlesdown11
Amazing article, I found it fascinating.
> You can already use Claude Code for non engineering tasks in professional services and get very impressive results without any industry specific modifications
After clicking on the link, and finding that Claude Code failed to accurately answer the single example tax question given, very impressive results! After all, why pay a professional to get something right when you can use Claude Code to get it wrong?
venturecruelty
No, because at least dark fiber is useful. AI GPUs will be shipped off to developing nations to be dissolved for rare earth metals once the third act of this clown show is over.
The article claims that AI services are currently over-utilised. Well isn't that because customers are being undercharged for services? A car when in neutral will rev up easily if the accelerator pedal is pushed even very slightly, because there's no load on the engine. But in gear the same engine will rev up much less when the accelerator is pushed the same amount. Will there be the same overutilisation occurring if users have to financially support the infrastructure, either through subscriptions or intrusive advertising?
I doubt it.
And what if the technology to locally run these systems without reliance on the cloud becomes commonplace, as it now is with open source models? The expensive part is in the training of these models more than the inference.
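As a concrete (if hand-wavy) sketch of what running inference locally looks like today, here is a call against a self-hosted Ollama server on its default port; the endpoint and payload shape follow Ollama's generate API, and the model name is just whatever you have pulled locally, so treat the details as assumptions and check them against your own setup:

    import json
    import urllib.request

    payload = {
        "model": "llama3",   # whichever model you have pulled locally
        "prompt": "Summarise the argument for and against an AI datacenter bubble.",
        "stream": False,     # return one JSON object instead of a token stream
    }
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])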